Oct 02 06:46:29 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 02 06:46:29 crc restorecon[4548]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 06:46:29 crc restorecon[4548]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 
06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29
crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 
06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 
crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc 
restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 06:46:29 crc restorecon[4548]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 02 06:46:30 crc kubenswrapper[4786]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 06:46:30 crc kubenswrapper[4786]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 02 06:46:30 crc kubenswrapper[4786]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 06:46:30 crc kubenswrapper[4786]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 02 06:46:30 crc kubenswrapper[4786]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 02 06:46:30 crc kubenswrapper[4786]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.056383 4786 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059426 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059458 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059463 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059468 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059473 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059480 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059484 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059488 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059491 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059494 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059499 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059502 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059505 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059509 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059512 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059515 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059518 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059521 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059524 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059528 4786 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059531 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059534 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059543 4786 feature_gate.go:330] unrecognized feature gate: Example Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059548 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059552 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059556 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059559 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059562 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059565 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059569 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059572 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059575 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059578 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059581 4786 feature_gate.go:330] unrecognized feature 
gate: HardwareSpeed Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059585 4786 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059588 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059591 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059594 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059598 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059601 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059605 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059608 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059611 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059614 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059618 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059622 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059625 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059629 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059633 4786 
feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059636 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059639 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059643 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059648 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059651 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059655 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059660 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059664 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059668 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059671 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059675 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059678 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059681 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059684 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059701 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059705 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059708 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059712 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059715 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059718 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.059722 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 06:46:30 crc 
kubenswrapper[4786]: W1002 06:46:30.059726 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060183 4786 flags.go:64] FLAG: --address="0.0.0.0" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060197 4786 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060205 4786 flags.go:64] FLAG: --anonymous-auth="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060211 4786 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060216 4786 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060220 4786 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060225 4786 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060230 4786 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060234 4786 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060238 4786 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060242 4786 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060252 4786 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060256 4786 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060260 4786 flags.go:64] FLAG: --cgroup-root="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060263 4786 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 02 06:46:30 crc 
kubenswrapper[4786]: I1002 06:46:30.060267 4786 flags.go:64] FLAG: --client-ca-file="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060271 4786 flags.go:64] FLAG: --cloud-config="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060274 4786 flags.go:64] FLAG: --cloud-provider="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060278 4786 flags.go:64] FLAG: --cluster-dns="[]" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060282 4786 flags.go:64] FLAG: --cluster-domain="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060286 4786 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060290 4786 flags.go:64] FLAG: --config-dir="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060293 4786 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060297 4786 flags.go:64] FLAG: --container-log-max-files="5" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060302 4786 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060306 4786 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060311 4786 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060315 4786 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060319 4786 flags.go:64] FLAG: --contention-profiling="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060322 4786 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060326 4786 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060331 4786 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 02 06:46:30 crc 
kubenswrapper[4786]: I1002 06:46:30.060334 4786 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060339 4786 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060343 4786 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060347 4786 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060351 4786 flags.go:64] FLAG: --enable-load-reader="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060355 4786 flags.go:64] FLAG: --enable-server="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060358 4786 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060365 4786 flags.go:64] FLAG: --event-burst="100" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060368 4786 flags.go:64] FLAG: --event-qps="50" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060372 4786 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060376 4786 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060381 4786 flags.go:64] FLAG: --eviction-hard="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060386 4786 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060390 4786 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060394 4786 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060399 4786 flags.go:64] FLAG: --eviction-soft="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060403 4786 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 02 
06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060406 4786 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060410 4786 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060413 4786 flags.go:64] FLAG: --experimental-mounter-path="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060417 4786 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060421 4786 flags.go:64] FLAG: --fail-swap-on="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060424 4786 flags.go:64] FLAG: --feature-gates="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060429 4786 flags.go:64] FLAG: --file-check-frequency="20s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060432 4786 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060436 4786 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060440 4786 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060444 4786 flags.go:64] FLAG: --healthz-port="10248" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060447 4786 flags.go:64] FLAG: --help="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060451 4786 flags.go:64] FLAG: --hostname-override="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060454 4786 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060458 4786 flags.go:64] FLAG: --http-check-frequency="20s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060463 4786 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060466 4786 flags.go:64] FLAG: --image-credential-provider-config="" Oct 02 
06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060470 4786 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060474 4786 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060477 4786 flags.go:64] FLAG: --image-service-endpoint="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060481 4786 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060484 4786 flags.go:64] FLAG: --kube-api-burst="100" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060488 4786 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060492 4786 flags.go:64] FLAG: --kube-api-qps="50" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060496 4786 flags.go:64] FLAG: --kube-reserved="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060500 4786 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060503 4786 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060507 4786 flags.go:64] FLAG: --kubelet-cgroups="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060511 4786 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060515 4786 flags.go:64] FLAG: --lock-file="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060518 4786 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060522 4786 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060526 4786 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060531 4786 flags.go:64] FLAG: --log-json-split-stream="false" Oct 02 
06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060535 4786 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060539 4786 flags.go:64] FLAG: --log-text-split-stream="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060543 4786 flags.go:64] FLAG: --logging-format="text" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060546 4786 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060550 4786 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060554 4786 flags.go:64] FLAG: --manifest-url="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060558 4786 flags.go:64] FLAG: --manifest-url-header="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060566 4786 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060570 4786 flags.go:64] FLAG: --max-open-files="1000000" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060574 4786 flags.go:64] FLAG: --max-pods="110" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060578 4786 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060582 4786 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060587 4786 flags.go:64] FLAG: --memory-manager-policy="None" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060590 4786 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060594 4786 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060597 4786 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060601 4786 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060610 4786 flags.go:64] FLAG: --node-status-max-images="50" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060614 4786 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060618 4786 flags.go:64] FLAG: --oom-score-adj="-999" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060622 4786 flags.go:64] FLAG: --pod-cidr="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060626 4786 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060632 4786 flags.go:64] FLAG: --pod-manifest-path="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060635 4786 flags.go:64] FLAG: --pod-max-pids="-1" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060640 4786 flags.go:64] FLAG: --pods-per-core="0" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060643 4786 flags.go:64] FLAG: --port="10250" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060647 4786 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060651 4786 flags.go:64] FLAG: --provider-id="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060654 4786 flags.go:64] FLAG: --qos-reserved="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060658 4786 flags.go:64] FLAG: --read-only-port="10255" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060662 4786 flags.go:64] FLAG: --register-node="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060666 4786 flags.go:64] FLAG: --register-schedulable="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060669 4786 flags.go:64] FLAG: 
--register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060676 4786 flags.go:64] FLAG: --registry-burst="10" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060679 4786 flags.go:64] FLAG: --registry-qps="5" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060683 4786 flags.go:64] FLAG: --reserved-cpus="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060706 4786 flags.go:64] FLAG: --reserved-memory="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060712 4786 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060716 4786 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060720 4786 flags.go:64] FLAG: --rotate-certificates="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060724 4786 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060727 4786 flags.go:64] FLAG: --runonce="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060731 4786 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060735 4786 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060739 4786 flags.go:64] FLAG: --seccomp-default="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060744 4786 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060748 4786 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060752 4786 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060756 4786 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060760 
4786 flags.go:64] FLAG: --storage-driver-password="root" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060764 4786 flags.go:64] FLAG: --storage-driver-secure="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060768 4786 flags.go:64] FLAG: --storage-driver-table="stats" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060772 4786 flags.go:64] FLAG: --storage-driver-user="root" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060775 4786 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060779 4786 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060784 4786 flags.go:64] FLAG: --system-cgroups="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060787 4786 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060794 4786 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060798 4786 flags.go:64] FLAG: --tls-cert-file="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060802 4786 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060806 4786 flags.go:64] FLAG: --tls-min-version="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060810 4786 flags.go:64] FLAG: --tls-private-key-file="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060814 4786 flags.go:64] FLAG: --topology-manager-policy="none" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060818 4786 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060822 4786 flags.go:64] FLAG: --topology-manager-scope="container" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060825 4786 flags.go:64] FLAG: --v="2" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060830 4786 
flags.go:64] FLAG: --version="false" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060835 4786 flags.go:64] FLAG: --vmodule="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060840 4786 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.060844 4786 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060944 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060948 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060952 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060955 4786 feature_gate.go:330] unrecognized feature gate: Example Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060959 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060962 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060966 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060969 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060972 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060976 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060979 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060982 4786 feature_gate.go:330] unrecognized feature 
gate: InsightsConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060986 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060991 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060995 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.060999 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061003 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061006 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061010 4786 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061013 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061017 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061020 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061023 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061026 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061030 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061033 4786 feature_gate.go:330] unrecognized feature gate: 
AWSClusterHostedDNS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061036 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061040 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061045 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061049 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061052 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061056 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061061 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061064 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061067 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061071 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061074 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061077 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061081 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 
06:46:30.061085 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061089 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061092 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061095 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061105 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061109 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061112 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061115 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061118 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061122 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061125 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061128 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061132 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061136 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061139 4786 feature_gate.go:330] unrecognized 
feature gate: BareMetalLoadBalancer Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061142 4786 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061146 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061149 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061162 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061165 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061169 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061172 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061176 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061179 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061183 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061187 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061190 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061194 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061197 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 06:46:30 
crc kubenswrapper[4786]: W1002 06:46:30.061201 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061204 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.061207 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.061224 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.067163 4786 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.067191 4786 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067253 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067261 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067266 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067270 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067274 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067278 4786 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067281 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067285 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067288 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067294 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067299 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067303 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067307 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067312 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067317 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067322 4786 feature_gate.go:330] unrecognized feature gate: Example Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067326 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067331 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067335 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067338 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067342 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067345 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067349 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067353 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067356 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067360 4786 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067363 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067366 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067369 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067373 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067376 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067380 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067384 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067387 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067391 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067395 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067398 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067402 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067405 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067408 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067411 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067414 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067417 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067421 4786 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067424 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067428 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067431 4786 feature_gate.go:330] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067436 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067439 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067443 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067446 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067449 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067453 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067456 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067459 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067462 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067465 4786 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067479 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067482 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067485 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067488 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 06:46:30 crc 
kubenswrapper[4786]: W1002 06:46:30.067492 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067495 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067498 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067501 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067504 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067509 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067512 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067517 4786 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067520 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067523 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.067530 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067635 4786 feature_gate.go:330] 
unrecognized feature gate: CSIDriverSharedResource Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067643 4786 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067646 4786 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067650 4786 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067654 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067657 4786 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067660 4786 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067665 4786 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067670 4786 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067673 4786 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067676 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067679 4786 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067683 4786 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067700 4786 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067703 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 06:46:30 crc 
kubenswrapper[4786]: W1002 06:46:30.067706 4786 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067710 4786 feature_gate.go:330] unrecognized feature gate: Example Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067713 4786 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067716 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067721 4786 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067724 4786 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067727 4786 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067730 4786 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067733 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067738 4786 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067742 4786 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067746 4786 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067750 4786 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067753 4786 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067757 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067760 4786 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067763 4786 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067767 4786 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067770 4786 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067773 4786 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067777 4786 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067780 4786 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067784 4786 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067787 4786 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067791 4786 feature_gate.go:330] unrecognized 
feature gate: OpenShiftPodSecurityAdmission Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067796 4786 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067800 4786 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067803 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067806 4786 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067809 4786 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067813 4786 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067817 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067821 4786 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067825 4786 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067828 4786 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067831 4786 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067834 4786 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067838 4786 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067841 4786 
feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067844 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067847 4786 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067851 4786 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067854 4786 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067857 4786 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067861 4786 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067865 4786 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067869 4786 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067872 4786 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067875 4786 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067878 4786 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067881 4786 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067885 4786 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067888 4786 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067892 4786 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067895 4786 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.067899 4786 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.067905 4786 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.068031 4786 server.go:940] "Client rotation is on, will bootstrap in background" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.070750 4786 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.070822 4786 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.071480 4786 server.go:997] "Starting client certificate rotation" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.071507 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.072390 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-20 16:00:49.402384342 +0000 UTC Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.072452 4786 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1185h14m19.329934071s for next certificate rotation Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.083181 4786 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.086561 4786 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.094781 4786 log.go:25] "Validated CRI v1 runtime API" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.110965 4786 log.go:25] "Validated CRI v1 image API" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.112121 4786 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.115779 4786 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-06-43-28-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.115803 4786 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm:{mountpoint:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:50 fsType:tmpfs blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/94b752e0a51c0134b00ddef6dc7a933a9d7c1d9bdc88a18dae4192a0d557d623/merged major:0 minor:43 fsType:overlay blockSize:0}] Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.129257 4786 manager.go:217] Machine: {Timestamp:2025-10-02 06:46:30.127622692 +0000 UTC m=+0.248805843 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2445406 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:21a64352-c064-4ec3-ab53-f6b24546cab3 BootID:32255bd5-d69e-4921-8286-62a7cb990a56 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} 
{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/var/lib/containers/storage/overlay-containers/75d81934760b26101869fbd8e4b5954c62b019c1cc3e5a0c9f82ed8de46b3b22/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:65536000 Type:vfs Inodes:4108168 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:50 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2b:01:1e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:enp3s0 MacAddress:fa:16:3e:2b:01:1e Speed:-1 Mtu:1500} {Name:enp7s0 MacAddress:fa:16:3e:f6:09:04 Speed:-1 Mtu:1440} {Name:enp7s0.20 MacAddress:52:54:00:95:c6:91 Speed:-1 Mtu:1436} {Name:enp7s0.21 MacAddress:52:54:00:29:dd:13 Speed:-1 Mtu:1436} {Name:enp7s0.22 MacAddress:52:54:00:38:a9:1b Speed:-1 Mtu:1436} {Name:eth10 MacAddress:ea:03:88:bb:cf:f3 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:42:ca:15:66:fd:2a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:65536 Type:Data Level:1} {Id:0 Size:65536 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:65536 Type:Data Level:1} {Id:1 Size:65536 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:65536 Type:Data Level:1} {Id:10 Size:65536 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:65536 Type:Data Level:1} {Id:11 Size:65536 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:65536 Type:Data Level:1} {Id:2 Size:65536 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:65536 Type:Data Level:1} {Id:3 Size:65536 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:65536 Type:Data Level:1} {Id:4 Size:65536 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:65536 Type:Data Level:1} {Id:5 Size:65536 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:65536 Type:Data Level:1} {Id:6 Size:65536 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:65536 Type:Data Level:1} {Id:7 Size:65536 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:65536 Type:Data Level:1} {Id:8 Size:65536 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 
Size:65536 Type:Data Level:1} {Id:9 Size:65536 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.129411 4786 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.129494 4786 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.129722 4786 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.129860 4786 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.129881 4786 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.130027 4786 topology_manager.go:138] "Creating topology manager with none policy" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.130035 4786 container_manager_linux.go:303] "Creating device plugin manager" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.130432 4786 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.130456 4786 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.130782 4786 state_mem.go:36] "Initialized new in-memory state store" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.130847 4786 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.132347 4786 kubelet.go:418] "Attempting to sync node with API server" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.132363 4786 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.132381 4786 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.132390 4786 kubelet.go:324] "Adding apiserver pod source" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.132399 4786 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.134339 4786 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.134808 4786 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.135581 4786 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.136075 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.136084 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.136173 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.23:6443: connect: connection refused" logger="UnhandledError" Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.136176 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.23:6443: connect: connection refused" logger="UnhandledError" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136375 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136395 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 
06:46:30.136401 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136407 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136416 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136429 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136435 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136444 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136451 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136457 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136470 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136475 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.136990 4786 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.137292 4786 server.go:1280] "Started kubelet" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.137825 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.137855 4786 
server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.137854 4786 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 02 06:46:30 crc systemd[1]: Started Kubernetes Kubelet. Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.138241 4786 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.142356 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.142384 4786 server.go:460] "Adding debug handlers to kubelet server" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.142444 4786 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.141665 4786 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 192.168.26.23:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a99b10c5cd970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 06:46:30.137272688 +0000 UTC m=+0.258455829,LastTimestamp:2025-10-02 06:46:30.137272688 +0000 UTC m=+0.258455829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.142481 4786 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.142490 4786 volume_manager.go:289] "Starting Kubelet 
Volume Manager" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.142445 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 21:06:33.25913434 +0000 UTC Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.142504 4786 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.142517 4786 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2102h20m3.116622764s for next certificate rotation Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.142553 4786 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.145472 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.145529 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.23:6443: connect: connection refused" logger="UnhandledError" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.145669 4786 factory.go:55] Registering systemd factory Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.145704 4786 factory.go:221] Registration of the systemd container factory successfully Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.145680 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial 
tcp 192.168.26.23:6443: connect: connection refused" interval="200ms" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.145946 4786 factory.go:153] Registering CRI-O factory Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.146033 4786 factory.go:221] Registration of the crio container factory successfully Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.146125 4786 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.146557 4786 factory.go:103] Registering Raw factory Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.146648 4786 manager.go:1196] Started watching for new ooms in manager Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.147279 4786 manager.go:319] Starting recovery of all containers Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.149766 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.149894 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.149906 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: 
I1002 06:46:30.149916 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.149924 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150053 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150066 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150074 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150084 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150091 4786 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150100 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150209 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150218 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150227 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150235 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150244 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150252 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150354 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150367 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150375 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150384 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150402 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150411 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150517 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150534 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150552 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150564 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150667 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150685 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150819 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150830 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150838 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150846 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150854 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150862 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150887 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150895 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150903 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150911 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150919 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" 
seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150926 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.150934 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151187 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151205 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151213 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151223 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 02 06:46:30 crc 
kubenswrapper[4786]: I1002 06:46:30.151231 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151239 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151247 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151357 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151367 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151376 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151387 4786 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151514 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151525 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151535 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151543 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151551 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151558 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151661 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151670 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151679 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151784 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151800 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151808 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151816 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151824 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151832 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151841 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151916 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151943 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151952 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151960 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151969 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151978 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151986 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.151993 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.152002 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.152026 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.152034 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.152041 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.152048 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.152056 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.152064 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153129 4786 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153163 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153176 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153185 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153193 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153201 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153209 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153217 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153226 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153235 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153243 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153251 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153260 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153267 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153275 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153283 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153290 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" 
seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153306 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153313 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153342 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153350 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153388 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153398 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 02 
06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153408 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153427 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153458 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153467 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153475 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153484 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153492 4786 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153500 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153508 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153516 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153530 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153538 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153546 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153554 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153561 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153581 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153588 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153596 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153610 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153617 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153624 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153632 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153640 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153655 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153662 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153669 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153712 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153722 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153730 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153737 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153744 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153751 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153759 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153766 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153783 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153792 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153800 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153827 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153836 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153843 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153851 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153857 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153872 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153887 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153895 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153903 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153910 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153917 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153924 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" 
seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153945 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153960 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153969 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153976 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153983 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.153991 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154004 4786 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154012 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154019 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154034 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154042 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154049 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154056 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154063 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154073 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154081 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154088 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154104 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154128 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154135 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154142 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154160 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154167 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154174 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154192 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154206 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154228 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154235 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154244 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154264 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154271 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154294 4786 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154327 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154340 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154347 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154357 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154370 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154388 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154416 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154424 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154437 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154449 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154457 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154463 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154470 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154476 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154495 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154501 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154508 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154522 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154531 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154538 4786 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154545 4786 reconstruct.go:97] "Volume reconstruction finished" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.154551 4786 reconciler.go:26] "Reconciler: start to sync state" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.166779 4786 manager.go:324] Recovery completed Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.175369 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.176206 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.176240 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.176264 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.176499 4786 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.177011 4786 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.177030 4786 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.177047 4786 state_mem.go:36] "Initialized new in-memory state store" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.177991 4786 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.178025 4786 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.178045 4786 kubelet.go:2335] "Starting kubelet main sync loop" Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.178080 4786 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.179034 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.179068 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.23:6443: connect: connection refused" logger="UnhandledError" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.181029 4786 policy_none.go:49] "None policy: Start" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.181590 4786 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 02 
06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.181662 4786 state_mem.go:35] "Initializing new in-memory state store" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.240501 4786 manager.go:334] "Starting Device Plugin manager" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.240559 4786 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.240571 4786 server.go:79] "Starting device plugin registration server" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.240893 4786 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.240926 4786 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.241094 4786 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.241171 4786 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.241182 4786 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.247163 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.278737 4786 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.278785 4786 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.279425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.279450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.279459 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.279545 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.279783 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.279815 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280127 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280156 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280241 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280356 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280390 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280594 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280622 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.280635 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281057 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281104 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281115 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281090 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281234 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281364 
4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281390 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281885 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281941 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281955 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281963 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.281985 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282052 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282076 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282733 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282823 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.282842 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.283337 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.283360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.283368 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.341188 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.343868 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.343920 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.343929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.343944 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.344303 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.23:6443: connect: connection refused" node="crc" Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.346782 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.23:6443: connect: connection refused" interval="400ms" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356202 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356220 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356234 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356249 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356262 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356299 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356322 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356335 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356378 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356422 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356521 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356595 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.356610 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457325 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457359 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457378 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457393 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457408 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457421 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457448 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457464 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457504 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457502 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457521 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457535 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457568 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457571 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457600 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457606 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457629 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457579 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457501 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457576 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457668 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457660 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457601 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457618 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.457715 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.545421 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.546252 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.546290 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.546301 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.546324 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.546624 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.23:6443: connect: 
connection refused" node="crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.597512 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.603417 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.618393 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.619626 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-333b3c78870c84eec18fb7603cf8e6b3a47a0a2242324cd3686bea6a49f2a9aa WatchSource:0}: Error finding container 333b3c78870c84eec18fb7603cf8e6b3a47a0a2242324cd3686bea6a49f2a9aa: Status 404 returned error can't find the container with id 333b3c78870c84eec18fb7603cf8e6b3a47a0a2242324cd3686bea6a49f2a9aa Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.622125 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-5c3599e23ffc0b2918ebb8451288e56ffe1bd2f143b6828b1ec034f165782b7f WatchSource:0}: Error finding container 5c3599e23ffc0b2918ebb8451288e56ffe1bd2f143b6828b1ec034f165782b7f: Status 404 returned error can't find the container with id 5c3599e23ffc0b2918ebb8451288e56ffe1bd2f143b6828b1ec034f165782b7f Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.622661 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.627090 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.631469 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7ba32a1e5ac1e49a1026f35eeacc3728a0d89a8c33d3bcb8a938832505a2a2f8 WatchSource:0}: Error finding container 7ba32a1e5ac1e49a1026f35eeacc3728a0d89a8c33d3bcb8a938832505a2a2f8: Status 404 returned error can't find the container with id 7ba32a1e5ac1e49a1026f35eeacc3728a0d89a8c33d3bcb8a938832505a2a2f8 Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.632338 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-cf877adb9952e3a6f9a357bfd7916179dabfc46e384276d32cb767a9af75edbc WatchSource:0}: Error finding container cf877adb9952e3a6f9a357bfd7916179dabfc46e384276d32cb767a9af75edbc: Status 404 returned error can't find the container with id cf877adb9952e3a6f9a357bfd7916179dabfc46e384276d32cb767a9af75edbc Oct 02 06:46:30 crc kubenswrapper[4786]: W1002 06:46:30.639866 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a80d0e372874d7dac18847599e261469699ba76c421fbae0ae2b2a7017d9fb16 WatchSource:0}: Error finding container a80d0e372874d7dac18847599e261469699ba76c421fbae0ae2b2a7017d9fb16: Status 404 returned error can't find the container with id a80d0e372874d7dac18847599e261469699ba76c421fbae0ae2b2a7017d9fb16 Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.747524 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.23:6443: connect: connection 
refused" interval="800ms" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.946725 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.947514 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.947544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.947553 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:30 crc kubenswrapper[4786]: I1002 06:46:30.947570 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 06:46:30 crc kubenswrapper[4786]: E1002 06:46:30.948030 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.23:6443: connect: connection refused" node="crc" Oct 02 06:46:31 crc kubenswrapper[4786]: W1002 06:46:31.015996 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:31 crc kubenswrapper[4786]: E1002 06:46:31.016220 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.26.23:6443: connect: connection refused" logger="UnhandledError" Oct 02 06:46:31 crc kubenswrapper[4786]: W1002 06:46:31.081513 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:31 crc kubenswrapper[4786]: E1002 06:46:31.081583 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 192.168.26.23:6443: connect: connection refused" logger="UnhandledError" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.138526 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.182441 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887" exitCode=0 Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.182518 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.182590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"333b3c78870c84eec18fb7603cf8e6b3a47a0a2242324cd3686bea6a49f2a9aa"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.182682 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.185750 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.185779 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.185788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.185865 4786 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1" exitCode=0 Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.185938 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.185964 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5c3599e23ffc0b2918ebb8451288e56ffe1bd2f143b6828b1ec034f165782b7f"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.186026 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.186784 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.186809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.186818 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.187449 4786 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738" exitCode=0 Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.187508 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.187529 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a80d0e372874d7dac18847599e261469699ba76c421fbae0ae2b2a7017d9fb16"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.187585 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.188223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.188258 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.188267 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.189427 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 
06:46:31.189460 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cf877adb9952e3a6f9a357bfd7916179dabfc46e384276d32cb767a9af75edbc"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.190521 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d" exitCode=0 Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.190566 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.190597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7ba32a1e5ac1e49a1026f35eeacc3728a0d89a8c33d3bcb8a938832505a2a2f8"} Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.190682 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.191392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.191419 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.191429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.196809 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.197524 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.197549 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.197558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:31 crc kubenswrapper[4786]: W1002 06:46:31.374278 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:31 crc kubenswrapper[4786]: E1002 06:46:31.374359 4786 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.26.23:6443: connect: connection refused" logger="UnhandledError" Oct 02 06:46:31 crc kubenswrapper[4786]: E1002 06:46:31.548024 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 192.168.26.23:6443: connect: connection refused" interval="1.6s" Oct 02 06:46:31 crc kubenswrapper[4786]: W1002 06:46:31.568516 4786 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.26.23:6443: connect: connection refused Oct 02 06:46:31 crc kubenswrapper[4786]: E1002 06:46:31.568579 4786 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.26.23:6443: connect: connection refused" logger="UnhandledError" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.748962 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.749842 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.749875 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.749884 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:31 crc kubenswrapper[4786]: I1002 06:46:31.749904 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 06:46:31 crc kubenswrapper[4786]: E1002 06:46:31.750218 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 192.168.26.23:6443: connect: connection refused" node="crc" Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.193906 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"249e201a7368ce9d1d48889790f03a8613db147b5af37b04d237769fd6c29267"} Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.193989 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.194532 4786 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.194565 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.194575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.195871 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3"} Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.195900 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055"} Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.195911 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38"} Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.195964 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.196473 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.196493 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.196501 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.197540 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.197558 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.197564 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.197573 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.198031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.198066 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.198074 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.199973 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.200021 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.200032 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.200041 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.200049 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.200138 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.200754 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.200781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.200790 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.201182 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379" exitCode=0
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.201210 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379"}
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.201285 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.201778 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.201804 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.201812 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:32 crc kubenswrapper[4786]: I1002 06:46:32.708732 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.087087 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.204937 4786 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769" exitCode=0
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205009 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205036 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205083 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205115 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205128 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769"}
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205172 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205254 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205791 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.205840 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208216 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208248 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208257 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208219 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208292 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208301 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208354 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.208363 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.283126 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.350279 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.350950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.350978 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.350988 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:33 crc kubenswrapper[4786]: I1002 06:46:33.351008 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.209929 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0"}
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.209952 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.209975 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b"}
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.209990 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13"}
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.209999 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8"}
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210007 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04"}
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210019 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210107 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210859 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210868 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210871 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210896 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210938 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:34 crc kubenswrapper[4786]: I1002 06:46:34.210971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.000448 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.003782 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.212123 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.212166 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.212876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.212894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.212901 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.212943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.212920 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:35 crc kubenswrapper[4786]: I1002 06:46:35.212985 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:36 crc kubenswrapper[4786]: I1002 06:46:36.214118 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 02 06:46:36 crc kubenswrapper[4786]: I1002 06:46:36.214165 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:36 crc kubenswrapper[4786]: I1002 06:46:36.214821 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:36 crc kubenswrapper[4786]: I1002 06:46:36.214851 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:36 crc kubenswrapper[4786]: I1002 06:46:36.214860 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.062137 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.062344 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.063485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.063519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.064328 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.216896 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.216992 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.217772 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.217821 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.217832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.432004 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.432113 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.432987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.433012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.433020 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.769873 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.769999 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.770795 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.770822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.770830 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:39 crc kubenswrapper[4786]: I1002 06:46:39.998989 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 06:46:40 crc kubenswrapper[4786]: I1002 06:46:40.221211 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:40 crc kubenswrapper[4786]: I1002 06:46:40.222283 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:40 crc kubenswrapper[4786]: I1002 06:46:40.222330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:40 crc kubenswrapper[4786]: I1002 06:46:40.222340 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:40 crc kubenswrapper[4786]: I1002 06:46:40.224071 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 02 06:46:40 crc kubenswrapper[4786]: E1002 06:46:40.247883 4786 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 02 06:46:41 crc kubenswrapper[4786]: I1002 06:46:41.204540 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 02 06:46:41 crc kubenswrapper[4786]: I1002 06:46:41.204600 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 02 06:46:41 crc kubenswrapper[4786]: I1002 06:46:41.222590 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 02 06:46:41 crc kubenswrapper[4786]: I1002 06:46:41.223160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:46:41 crc kubenswrapper[4786]: I1002 06:46:41.223185 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:46:41 crc kubenswrapper[4786]: I1002 06:46:41.223193 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:46:42 crc kubenswrapper[4786]: I1002 06:46:42.139276 4786 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Oct 02 06:46:42 crc kubenswrapper[4786]: I1002 06:46:42.285037 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 02 06:46:42 crc kubenswrapper[4786]: I1002 06:46:42.285083 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 02 06:46:42 crc kubenswrapper[4786]: I1002 06:46:42.291148 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 02 06:46:42 crc kubenswrapper[4786]: I1002 06:46:42.291177 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 02 06:46:42 crc kubenswrapper[4786]: I1002 06:46:42.998987 4786 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 02 06:46:42 crc kubenswrapper[4786]: I1002 06:46:42.999042 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 06:46:43 crc kubenswrapper[4786]: I1002 06:46:43.292736 4786 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]log ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]etcd ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-apiserver-admission-initializer ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-api-request-count-filter ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-startkubeinformers ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/generic-apiserver-start-informers ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/priority-and-fairness-config-consumer ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/priority-and-fairness-filter ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/storage-object-count-tracker-hook ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-apiextensions-informers ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-apiextensions-controllers ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/crd-informer-synced ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-system-namespaces-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-cluster-authentication-info-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-legacy-token-tracking-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-service-ip-repair-controllers ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Oct 02 06:46:43 crc kubenswrapper[4786]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/priority-and-fairness-config-producer ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/bootstrap-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/start-kube-aggregator-informers ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/apiservice-status-local-available-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/apiservice-status-remote-available-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/apiservice-registration-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/apiservice-wait-for-first-sync ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/apiservice-discovery-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/kube-apiserver-autoregistration ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]autoregister-completion ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/apiservice-openapi-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: [+]poststarthook/apiservice-openapiv3-controller ok
Oct 02 06:46:43 crc kubenswrapper[4786]: livez check failed
Oct 02 06:46:43 crc kubenswrapper[4786]: I1002 06:46:43.292981 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 06:46:47 crc kubenswrapper[4786]: E1002 06:46:47.295042 4786 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.296066 4786 trace.go:236] Trace[1594491379]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 06:46:33.152) (total time: 14143ms):
Oct 02 06:46:47 crc kubenswrapper[4786]: Trace[1594491379]: ---"Objects listed" error: 14143ms (06:46:47.295)
Oct 02 06:46:47 crc kubenswrapper[4786]: Trace[1594491379]: [14.143991715s] [14.143991715s] END
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.296102 4786 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.296870 4786 trace.go:236] Trace[570152855]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 06:46:34.039) (total time: 13257ms):
Oct 02 06:46:47 crc kubenswrapper[4786]: Trace[570152855]: ---"Objects listed" error: 13257ms (06:46:47.296)
Oct 02 06:46:47 crc kubenswrapper[4786]: Trace[570152855]: [13.257099346s] [13.257099346s] END
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.296891 4786 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 02 06:46:47 crc kubenswrapper[4786]: E1002 06:46:47.297264 4786 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.297786 4786 trace.go:236] Trace[416279795]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 06:46:32.762) (total time: 14535ms):
Oct 02 06:46:47 crc kubenswrapper[4786]: Trace[416279795]: ---"Objects listed" error: 14535ms (06:46:47.297)
Oct 02 06:46:47 crc kubenswrapper[4786]: Trace[416279795]: [14.535188395s] [14.535188395s] END
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.297814 4786 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.297888 4786 trace.go:236] Trace[406448833]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 06:46:32.841) (total time: 14456ms):
Oct 02 06:46:47 crc kubenswrapper[4786]: Trace[406448833]: ---"Objects listed" error: 14456ms (06:46:47.297)
Oct 02 06:46:47 crc kubenswrapper[4786]: Trace[406448833]: [14.456700226s] [14.456700226s] END
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.297902 4786 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 02 06:46:47 crc kubenswrapper[4786]: I1002 06:46:47.298489 4786 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.141630 4786 apiserver.go:52] "Watching apiserver"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.144867 4786 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.145179 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-g5lv2","openshift-machine-config-operator/machine-config-daemon-p6dmq","openshift-multus/multus-7hgkl","openshift-multus/multus-additional-cni-plugins-r5k86","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/iptables-alerter-4ln5h"]
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.145481 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.145531 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.145492 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.145646 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.145715 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.145792 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.145800 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.145943 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g5lv2"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.145956 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.146065 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq"
Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.146071 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.146379 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7hgkl"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.146395 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r5k86"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.148740 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.148778 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.149079 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.149137 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.149156 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.149254 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.153129 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.153190 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.153194 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.153143 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.153192 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.153260 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.153260 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.153227 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.155033 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.155173 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.160453 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.160506 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.160615 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.160663 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.160832 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 02 06:46:48 crc
kubenswrapper[4786]: I1002 06:46:48.161147 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.161154 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.161348 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.169764 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.177069 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.183572 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.189491 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.195398 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.201907 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.207504 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.214713 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.220162 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.226178 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.230789 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.238310 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.239650 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e" exitCode=255 Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.239680 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e"} Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.243540 4786 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.245771 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.246022 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.246242 4786 scope.go:117] "RemoveContainer" containerID="b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.254290 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.261958 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.268295 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.276418 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.281161 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.286989 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.287131 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.294964 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.300635 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303567 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303599 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303619 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303635 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303648 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303662 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303706 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303722 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303755 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303771 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303784 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303800 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303813 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303826 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 
06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303841 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303856 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303852 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303869 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303884 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303924 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303954 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303969 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303983 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.303998 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304011 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304025 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304051 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304066 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 06:46:48 crc 
kubenswrapper[4786]: I1002 06:46:48.304081 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304095 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304109 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304102 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304124 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304139 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304153 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304167 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304182 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304197 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304213 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304227 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304242 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304256 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304270 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 06:46:48 crc 
kubenswrapper[4786]: I1002 06:46:48.304284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304298 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304311 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304230 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304325 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304348 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304372 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304392 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304412 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304430 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304448 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304462 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304477 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304492 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304506 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304523 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304541 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304556 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304571 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304587 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 06:46:48 crc 
kubenswrapper[4786]: I1002 06:46:48.304601 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304616 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304630 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304646 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304660 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304674 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304704 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304720 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304734 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304748 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304763 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304781 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304797 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304830 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304844 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304859 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304874 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304889 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304904 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304918 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304954 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304969 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304985 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305001 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305049 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305065 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305116 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305132 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305146 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305161 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305177 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305193 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305208 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305223 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305240 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305255 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305270 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305286 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305302 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305317 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305333 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305347 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305397 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305414 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305429 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305445 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305461 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305476 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305491 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305507 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305522 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305536 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305551 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305587 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305659 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305677 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305707 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305724 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305740 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305756 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305770 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305785 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305800 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305831 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306194 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306219 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306236 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306251 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306268 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306285 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306302 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306321 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306355 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306397 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306416 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306434 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306449 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306486 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306502 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306517 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306533 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306567 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306582 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306597 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306629 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306644 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306678 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306717 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306735 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306751 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306766 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306798 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306815 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306832 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306865 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306882 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306899 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306915 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306947 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306964 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306979 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306994 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307032 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307058 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307073 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307107 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307125 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307140 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307157 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307192 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307210 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307230 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307265 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307282 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307299 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307324 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307341 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307356 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307373 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307389 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307405 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307420 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307435 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307451 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307469 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307484 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307500 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307515 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307530 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307549 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307614 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-kubelet\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307641 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307657 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-system-cni-dir\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307675 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwpx\" (UniqueName: \"kubernetes.io/projected/b625fb23-ba7e-4931-b753-94dc23e8effa-kube-api-access-tdwpx\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307706 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/35f634be-a80e-4770-a408-a258fd303dee-hosts-file\") pod \"node-resolver-g5lv2\" (UID: \"35f634be-a80e-4770-a408-a258fd303dee\") " pod="openshift-dns/node-resolver-g5lv2" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307722 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-cni-binary-copy\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307738 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx8n4\" (UniqueName: 
\"kubernetes.io/projected/35f634be-a80e-4770-a408-a258fd303dee-kube-api-access-sx8n4\") pod \"node-resolver-g5lv2\" (UID: \"35f634be-a80e-4770-a408-a258fd303dee\") " pod="openshift-dns/node-resolver-g5lv2" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307779 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307800 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307816 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-system-cni-dir\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307831 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-netns\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307847 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307880 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-os-release\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307900 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307916 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-os-release\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307931 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-daemon-config\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307962 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-multus-certs\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307981 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b625fb23-ba7e-4931-b753-94dc23e8effa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307998 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79cb22df-4930-4aed-9108-1056074d1000-mcd-auth-proxy-config\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308014 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfbjt\" (UniqueName: \"kubernetes.io/projected/79cb22df-4930-4aed-9108-1056074d1000-kube-api-access-gfbjt\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308030 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308053 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-cnibin\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308087 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308106 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308138 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-cni-dir\") pod 
\"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308152 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-k8s-cni-cncf-io\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308167 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-cni-multus\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308180 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-hostroot\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308199 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308214 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79cb22df-4930-4aed-9108-1056074d1000-proxy-tls\") pod \"machine-config-daemon-p6dmq\" 
(UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308228 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-socket-dir-parent\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308242 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-conf-dir\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308257 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308273 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308288 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-etc-kubernetes\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6gtj\" (UniqueName: \"kubernetes.io/projected/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-kube-api-access-z6gtj\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308320 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/79cb22df-4930-4aed-9108-1056074d1000-rootfs\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308333 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-cni-bin\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304469 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308351 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304497 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304578 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304622 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304823 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.304737 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305149 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305154 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305176 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305450 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305450 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305480 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305531 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.305923 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306077 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306115 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306163 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306246 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308525 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306367 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306421 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306426 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306684 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306723 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.306973 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307132 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307408 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307430 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307659 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307671 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307817 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307875 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.307974 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308680 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308745 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308370 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308871 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308880 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b625fb23-ba7e-4931-b753-94dc23e8effa-cni-binary-copy\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308896 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308099 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308094 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308213 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308272 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308284 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308303 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308603 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.308619 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309072 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-cnibin\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309117 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309212 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309227 4786 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309239 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309252 4786 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309253 4786 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309263 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309274 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309286 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309296 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309305 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309314 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc 
kubenswrapper[4786]: I1002 06:46:48.309325 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309334 4786 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309345 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309354 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309364 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309373 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309382 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309391 4786 
reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309401 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309411 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309419 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309428 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309437 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309446 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309455 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.309460 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309466 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.309512 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:48.809496558 +0000 UTC m=+18.930679690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309533 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309564 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309580 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309738 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309863 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310014 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310078 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310170 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.310248 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.310277 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:48.810269324 +0000 UTC m=+18.931452456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310291 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310324 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310345 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310361 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310486 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310547 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310740 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310761 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310781 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.310935 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.312110 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.312204 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.312371 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.312388 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.312426 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.312852 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.312867 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.312989 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313176 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313193 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313159 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313232 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313433 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313475 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313730 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313772 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313792 4786 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.313897 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.314012 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.314241 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.314246 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.314281 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.314348 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.316700 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317220 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317555 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317765 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.309463 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317920 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317934 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317944 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317953 4786 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317963 4786 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317972 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317982 4786 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.317991 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318000 4786 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318008 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318016 4786 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318024 4786 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318046 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318055 4786 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318063 4786 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318072 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318080 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318089 4786 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318098 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318107 4786 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318116 4786 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318125 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318133 4786 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318143 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318415 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318577 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318798 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.318941 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.319087 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.319274 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.319373 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.319381 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.319469 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:46:48.819456349 +0000 UTC m=+18.940639480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.320006 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.320195 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.320235 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.320490 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.320541 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.320720 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.320741 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.320753 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.320794 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:48.820782482 +0000 UTC m=+18.941965613 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.320951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.321072 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.321277 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.321640 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.321819 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.321866 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322048 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322116 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322198 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322405 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322463 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322583 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322806 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322938 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.322883 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.323222 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.323321 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.323439 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.323462 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.323509 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.323669 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.323873 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.323983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.324022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.324137 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.324478 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.324498 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.324528 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.324596 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.324904 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.324996 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325241 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325352 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325386 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325425 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325447 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325466 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325506 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325643 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325716 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325759 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325972 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.325990 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326089 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326125 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326158 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326307 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326396 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326442 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326560 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326563 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.326787 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.327600 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.328201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.328403 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.328518 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.328564 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.328608 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.328623 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.328612 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.328633 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.328776 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:48.828745666 +0000 UTC m=+18.949928797 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.331572 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.334421 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.334534 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335078 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335136 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335609 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335641 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335681 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335683 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335768 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335800 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335856 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335875 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335938 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.335971 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.336151 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.337485 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.338316 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.338460 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.338583 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.338682 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.338735 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.339093 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.339567 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.341070 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.341130 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.341226 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.341280 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.341675 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.342535 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.342794 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.342838 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.342861 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.342915 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.343472 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.351421 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.356427 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.357243 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.360894 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.364105 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.365802 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.366668 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.369088 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.377998 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.384831 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.391196 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.399670 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/79cb22df-4930-4aed-9108-1056074d1000-rootfs\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419089 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-cni-bin\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419114 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b625fb23-ba7e-4931-b753-94dc23e8effa-cni-binary-copy\") pod \"multus-additional-cni-plugins-r5k86\" (UID: 
\"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419149 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-cnibin\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419153 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-cni-bin\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419170 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-system-cni-dir\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419128 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/79cb22df-4930-4aed-9108-1056074d1000-rootfs\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419185 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwpx\" (UniqueName: \"kubernetes.io/projected/b625fb23-ba7e-4931-b753-94dc23e8effa-kube-api-access-tdwpx\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " 
pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-cnibin\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419219 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/35f634be-a80e-4770-a408-a258fd303dee-hosts-file\") pod \"node-resolver-g5lv2\" (UID: \"35f634be-a80e-4770-a408-a258fd303dee\") " pod="openshift-dns/node-resolver-g5lv2" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419225 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-system-cni-dir\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419233 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-cni-binary-copy\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419297 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/35f634be-a80e-4770-a408-a258fd303dee-hosts-file\") pod \"node-resolver-g5lv2\" (UID: \"35f634be-a80e-4770-a408-a258fd303dee\") " pod="openshift-dns/node-resolver-g5lv2" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419324 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-kubelet\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419352 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx8n4\" (UniqueName: \"kubernetes.io/projected/35f634be-a80e-4770-a408-a258fd303dee-kube-api-access-sx8n4\") pod \"node-resolver-g5lv2\" (UID: \"35f634be-a80e-4770-a408-a258fd303dee\") " pod="openshift-dns/node-resolver-g5lv2" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419374 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-system-cni-dir\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419385 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-kubelet\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419390 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-netns\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419413 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-netns\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419416 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-os-release\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419461 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-os-release\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-daemon-config\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419494 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-multus-certs\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419517 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-os-release\") pod \"multus-7hgkl\" (UID: 
\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419515 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b625fb23-ba7e-4931-b753-94dc23e8effa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419549 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79cb22df-4930-4aed-9108-1056074d1000-mcd-auth-proxy-config\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419582 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfbjt\" (UniqueName: \"kubernetes.io/projected/79cb22df-4930-4aed-9108-1056074d1000-kube-api-access-gfbjt\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419598 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419611 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-cnibin\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-cni-dir\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419657 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-k8s-cni-cncf-io\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419670 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-cni-multus\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419685 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-hostroot\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419724 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79cb22df-4930-4aed-9108-1056074d1000-proxy-tls\") pod \"machine-config-daemon-p6dmq\" (UID: 
\"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419737 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-socket-dir-parent\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419751 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-conf-dir\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419765 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-cni-binary-copy\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419794 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " 
pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419811 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-etc-kubernetes\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419825 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6gtj\" (UniqueName: \"kubernetes.io/projected/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-kube-api-access-z6gtj\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419832 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-os-release\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419906 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-cnibin\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419908 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-hostroot\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.419992 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-cni-dir\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420012 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-k8s-cni-cncf-io\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420116 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b625fb23-ba7e-4931-b753-94dc23e8effa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420154 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420183 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-socket-dir-parent\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420197 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-conf-dir\") pod \"multus-7hgkl\" (UID: 
\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420210 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420260 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-var-lib-cni-multus\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420397 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-etc-kubernetes\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-system-cni-dir\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420455 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-host-run-multus-certs\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420539 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-multus-daemon-config\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420550 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420570 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420635 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420647 4786 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420656 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420666 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420676 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420685 4786 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420710 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420718 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420726 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420734 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420743 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on 
node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420750 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420758 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420766 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420773 4786 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420781 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420788 4786 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420796 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420804 4786 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420813 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420821 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420828 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420836 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420844 4786 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420851 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420859 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420868 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420875 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420883 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420891 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420899 4786 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420907 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420915 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath 
\"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420923 4786 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420930 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/79cb22df-4930-4aed-9108-1056074d1000-mcd-auth-proxy-config\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420938 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420975 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420985 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.420994 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421002 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421011 4786 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421020 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421029 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421048 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421055 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421054 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/b625fb23-ba7e-4931-b753-94dc23e8effa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421067 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421095 4786 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421106 4786 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421116 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421124 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421133 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421140 4786 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421149 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421156 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421164 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421172 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421181 4786 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421189 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421197 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node 
\"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421205 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421212 4786 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421220 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421227 4786 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421235 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421242 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421250 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc 
kubenswrapper[4786]: I1002 06:46:48.421257 4786 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421265 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421273 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421281 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421291 4786 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421298 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421306 4786 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 
crc kubenswrapper[4786]: I1002 06:46:48.421314 4786 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421321 4786 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421329 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421336 4786 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421343 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421351 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421359 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421367 
4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421375 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421382 4786 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421389 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421396 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421404 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421411 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421419 4786 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421427 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421435 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421443 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421451 4786 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421458 4786 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421465 4786 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421472 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421480 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421487 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421494 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421501 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421510 4786 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421517 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421524 4786 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc 
kubenswrapper[4786]: I1002 06:46:48.421531 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421540 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421547 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421554 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421562 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421569 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421576 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421584 4786 reconciler_common.go:293] "Volume detached for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421591 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421600 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421607 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421614 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421621 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421640 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.421649 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 
06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.422869 4786 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.422881 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423089 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423106 4786 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423160 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423170 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423177 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423185 4786 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423192 4786 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423262 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423313 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423322 4786 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423330 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423338 4786 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423346 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423337 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79cb22df-4930-4aed-9108-1056074d1000-proxy-tls\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423372 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423409 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423418 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423452 4786 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423462 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423470 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423477 4786 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423485 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423492 4786 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.423500 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.422475 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b625fb23-ba7e-4931-b753-94dc23e8effa-cni-binary-copy\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.431758 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdwpx\" (UniqueName: \"kubernetes.io/projected/b625fb23-ba7e-4931-b753-94dc23e8effa-kube-api-access-tdwpx\") pod \"multus-additional-cni-plugins-r5k86\" (UID: \"b625fb23-ba7e-4931-b753-94dc23e8effa\") " 
pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.432801 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfbjt\" (UniqueName: \"kubernetes.io/projected/79cb22df-4930-4aed-9108-1056074d1000-kube-api-access-gfbjt\") pod \"machine-config-daemon-p6dmq\" (UID: \"79cb22df-4930-4aed-9108-1056074d1000\") " pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.433256 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx8n4\" (UniqueName: \"kubernetes.io/projected/35f634be-a80e-4770-a408-a258fd303dee-kube-api-access-sx8n4\") pod \"node-resolver-g5lv2\" (UID: \"35f634be-a80e-4770-a408-a258fd303dee\") " pod="openshift-dns/node-resolver-g5lv2" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.435298 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6gtj\" (UniqueName: \"kubernetes.io/projected/de8dcd53-84d9-422e-8f18-63ea8ea75bd2-kube-api-access-z6gtj\") pod \"multus-7hgkl\" (UID: \"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\") " pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.454369 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 06:46:48 crc kubenswrapper[4786]: W1002 06:46:48.461993 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-d0953661bad19b534788ffc9fa281128d3d39bc33d8dc390706f8ea83e46b443 WatchSource:0}: Error finding container d0953661bad19b534788ffc9fa281128d3d39bc33d8dc390706f8ea83e46b443: Status 404 returned error can't find the container with id d0953661bad19b534788ffc9fa281128d3d39bc33d8dc390706f8ea83e46b443 Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.464677 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.469997 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.475084 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g5lv2" Oct 02 06:46:48 crc kubenswrapper[4786]: W1002 06:46:48.477492 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b92ec6d03c91f533d77bd1c52b5fc89432debcb53470b133a7001cac5977428c WatchSource:0}: Error finding container b92ec6d03c91f533d77bd1c52b5fc89432debcb53470b133a7001cac5977428c: Status 404 returned error can't find the container with id b92ec6d03c91f533d77bd1c52b5fc89432debcb53470b133a7001cac5977428c Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.482137 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.485569 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7hgkl" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.486887 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bgs8z"] Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.487565 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.489970 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.490284 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.490475 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r5k86" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.490556 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.490413 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.490772 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.491448 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.491827 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 06:46:48 crc kubenswrapper[4786]: W1002 06:46:48.498946 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79cb22df_4930_4aed_9108_1056074d1000.slice/crio-77cd4e1802b7c756ad949ab1b48a2b7717eb6640c109774e46167c74397f4672 WatchSource:0}: Error finding container 77cd4e1802b7c756ad949ab1b48a2b7717eb6640c109774e46167c74397f4672: Status 404 returned error can't find the container with id 77cd4e1802b7c756ad949ab1b48a2b7717eb6640c109774e46167c74397f4672 Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.499294 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.506570 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.513416 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.523330 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.530023 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.546187 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.584965 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624381 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-kubelet\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 
crc kubenswrapper[4786]: I1002 06:46:48.624402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-systemd\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624417 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-env-overrides\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624430 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-node-log\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624442 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-log-socket\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624457 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-systemd-units\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc 
kubenswrapper[4786]: I1002 06:46:48.624470 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/894eab78-90cf-4975-aa45-223332e04f5c-ovn-node-metrics-cert\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624485 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsb84\" (UniqueName: \"kubernetes.io/projected/894eab78-90cf-4975-aa45-223332e04f5c-kube-api-access-nsb84\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624501 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-slash\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624536 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-ovn\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624550 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-script-lib\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624563 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-var-lib-openvswitch\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624576 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-etc-openvswitch\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624590 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624604 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-netns\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624616 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-netd\") pod \"ovnkube-node-bgs8z\" (UID: 
\"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624631 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624650 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-bin\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624662 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-config\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.624683 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-openvswitch\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.626295 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.665940 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.705039 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725861 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-ovn\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725892 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-script-lib\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725908 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-var-lib-openvswitch\") 
pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725922 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-etc-openvswitch\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725936 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725949 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-netns\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725961 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-netd\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.725998 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-bin\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726012 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-config\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-openvswitch\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726059 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-kubelet\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726074 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-systemd\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726090 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-node-log\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726101 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-log-socket\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726113 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-env-overrides\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726125 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-systemd-units\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726138 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/894eab78-90cf-4975-aa45-223332e04f5c-ovn-node-metrics-cert\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: 
I1002 06:46:48.726152 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsb84\" (UniqueName: \"kubernetes.io/projected/894eab78-90cf-4975-aa45-223332e04f5c-kube-api-access-nsb84\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726172 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-slash\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726222 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-slash\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726262 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-var-lib-openvswitch\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726283 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-etc-openvswitch\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726301 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-ovn-kubernetes\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726319 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-netns\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726338 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-netd\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726358 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726376 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-bin\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726483 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-script-lib\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726543 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-ovn\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726571 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-log-socket\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-openvswitch\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726615 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-kubelet\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726635 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-systemd\") pod \"ovnkube-node-bgs8z\" (UID: 
\"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726653 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-node-log\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726674 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-systemd-units\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726813 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-config\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.726990 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-env-overrides\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.729443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/894eab78-90cf-4975-aa45-223332e04f5c-ovn-node-metrics-cert\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc 
kubenswrapper[4786]: I1002 06:46:48.748243 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.776160 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsb84\" (UniqueName: \"kubernetes.io/projected/894eab78-90cf-4975-aa45-223332e04f5c-kube-api-access-nsb84\") pod \"ovnkube-node-bgs8z\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.800129 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:48 crc kubenswrapper[4786]: W1002 06:46:48.810146 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod894eab78_90cf_4975_aa45_223332e04f5c.slice/crio-d5f32b54be0b5c8f03f781c6e922a4e9c3f5a50647be38cb9f7e6ae3c98cd11a WatchSource:0}: Error finding container d5f32b54be0b5c8f03f781c6e922a4e9c3f5a50647be38cb9f7e6ae3c98cd11a: Status 404 returned error can't find the container with id d5f32b54be0b5c8f03f781c6e922a4e9c3f5a50647be38cb9f7e6ae3c98cd11a Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.811608 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.826950 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.827061 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827101 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:46:49.827078152 +0000 UTC m=+19.948261283 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.827139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.827202 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827141 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827266 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827298 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:49.827287013 +0000 UTC m=+19.948470143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827185 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827315 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:49.827309505 +0000 UTC m=+19.948492636 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827323 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827334 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.827359 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:49.827350702 +0000 UTC m=+19.948533833 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:48 crc kubenswrapper[4786]: I1002 06:46:48.927987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.928116 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.928138 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.928148 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:48 crc kubenswrapper[4786]: E1002 06:46:48.928193 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-02 06:46:49.928180837 +0000 UTC m=+20.049363968 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.243575 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.245061 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.245736 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.248968 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7hgkl" event={"ID":"de8dcd53-84d9-422e-8f18-63ea8ea75bd2","Type":"ContainerStarted","Data":"5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.248994 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7hgkl" event={"ID":"de8dcd53-84d9-422e-8f18-63ea8ea75bd2","Type":"ContainerStarted","Data":"b3f41a12db5663bf8a508de104ef22e2b7bc8672d8dfc38ababf843512533325"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.249214 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.250004 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.250039 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d0953661bad19b534788ffc9fa281128d3d39bc33d8dc390706f8ea83e46b443"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.250751 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b92ec6d03c91f533d77bd1c52b5fc89432debcb53470b133a7001cac5977428c"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.251513 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a" exitCode=0 Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.251548 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.251562 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"d5f32b54be0b5c8f03f781c6e922a4e9c3f5a50647be38cb9f7e6ae3c98cd11a"} Oct 02 06:46:49 crc kubenswrapper[4786]: 
I1002 06:46:49.253727 4786 generic.go:334] "Generic (PLEG): container finished" podID="b625fb23-ba7e-4931-b753-94dc23e8effa" containerID="e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b" exitCode=0 Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.253779 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" event={"ID":"b625fb23-ba7e-4931-b753-94dc23e8effa","Type":"ContainerDied","Data":"e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.253797 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" event={"ID":"b625fb23-ba7e-4931-b753-94dc23e8effa","Type":"ContainerStarted","Data":"67453b419b11c9c8aa375324f3f4374e63ecbf6cfede6b751eb21436a74b40ca"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.257429 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.259064 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.259086 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.259096 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" 
event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"77cd4e1802b7c756ad949ab1b48a2b7717eb6640c109774e46167c74397f4672"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.260703 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g5lv2" event={"ID":"35f634be-a80e-4770-a408-a258fd303dee","Type":"ContainerStarted","Data":"72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.260727 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g5lv2" event={"ID":"35f634be-a80e-4770-a408-a258fd303dee","Type":"ContainerStarted","Data":"da6181dbd65a43a1d38abe06fc88c1dea7b1fcdca4fd6e8c3739a44b46f1a3ff"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.263771 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.263814 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.263824 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e2ad10a3ff21388ad9dffd5bff4ed46a43a514ba4ecf7ee5a3ffdb87ee85409d"} Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.275860 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.284296 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.295595 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.306349 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.315845 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.329177 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.346824 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.355512 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.363241 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.374873 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.382237 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.389804 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.405932 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.413426 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.449817 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.486782 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.529413 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.568955 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.607679 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.653927 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.686038 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.733339 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.772899 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.796408 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.806784 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.809325 4786 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.827414 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-pro
xy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.836855 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.836950 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837001 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:46:51.836982884 +0000 UTC m=+21.958166016 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837030 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.837069 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837072 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:51.837062304 +0000 UTC m=+21.958245435 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.837118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837181 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837216 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:51.837209118 +0000 UTC m=+21.958392249 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837244 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837263 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837274 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.837312 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:51.837300049 +0000 UTC m=+21.958483179 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.868115 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.907232 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.938431 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.938531 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.938549 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.938560 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:49 crc kubenswrapper[4786]: E1002 06:46:49.938597 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:51.938587079 +0000 UTC m=+22.059770210 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.947499 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kub
e\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:49 crc kubenswrapper[4786]: I1002 06:46:49.986958 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325745
3265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T06:46:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.001839 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.004515 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.025332 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.046462 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.088123 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 
06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.126519 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.168435 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.178278 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.178305 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.178338 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.178371 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.178438 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.178518 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.181528 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.182221 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.183275 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.183884 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.185954 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.186458 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.187000 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.187544 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.188138 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.188577 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.189078 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.189668 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.190148 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.190595 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.191075 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.191527 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.192037 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.192409 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.192961 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.193472 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.195027 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.195642 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.196101 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.197116 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.197539 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.198489 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.199135 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.199936 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.200475 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.201220 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.201858 4786 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.201964 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.203186 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.203714 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.204153 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.205208 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.205812 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.206318 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.207133 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.207200 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.207730 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.208376 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.208912 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.209476 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.212400 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.212828 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.213618 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.214124 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.215145 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.215591 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.216363 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.216791 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.217576 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.218086 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.218494 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.247783 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.266059 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.267556 4786 generic.go:334] "Generic (PLEG): container finished" podID="b625fb23-ba7e-4931-b753-94dc23e8effa" containerID="94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d" exitCode=0 Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.267607 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" event={"ID":"b625fb23-ba7e-4931-b753-94dc23e8effa","Type":"ContainerDied","Data":"94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.271254 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.271310 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.271322 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.271331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" 
event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.271340 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.271348 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.291616 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.327320 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.367122 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.407095 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.448021 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.487744 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.497640 4786 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.501059 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.501420 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.501485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.501591 4786 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.546059 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.561972 4786 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.562128 4786 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.562816 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.562841 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.562850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.562862 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.562871 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.575213 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.577243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.577269 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.577278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.577288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.577296 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.585376 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.588096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.588199 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.588255 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.588306 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.588423 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.596113 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.598069 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.598151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.598212 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.598288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.598348 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.605848 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.607592 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.608365 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.608390 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc 
kubenswrapper[4786]: I1002 06:46:50.608399 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.608408 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.608416 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.616034 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: E1002 06:46:50.616131 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.617174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.617203 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.617212 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.617224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.617232 4786 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.645356 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolv
er\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.688744 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.718527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.718550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.718560 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.718571 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.718579 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.726977 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.766891 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.812105 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.820183 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.820216 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.820224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.820236 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.820247 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.847667 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z 
is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.891884 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.921425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.921453 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.921462 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.921474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.921483 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:50Z","lastTransitionTime":"2025-10-02T06:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.926764 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375
e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:50 crc kubenswrapper[4786]: I1002 06:46:50.966996 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:46:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.008205 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.023527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.023556 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.023564 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.023577 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.023586 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.048685 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.089678 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.128245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.128338 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.128432 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.128499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.128683 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.128645 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.167814 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.212056 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.231041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.231073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.231082 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.231092 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.231100 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.247435 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z 
is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.275134 4786 generic.go:334] "Generic (PLEG): container finished" podID="b625fb23-ba7e-4931-b753-94dc23e8effa" containerID="432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53" exitCode=0 Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.275207 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" event={"ID":"b625fb23-ba7e-4931-b753-94dc23e8effa","Type":"ContainerDied","Data":"432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.291416 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.329261 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.332464 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.332490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.332500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 
06:46:51.332509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.332519 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.367617 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.408026 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c3175615
82892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.433815 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.433841 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.433850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc 
kubenswrapper[4786]: I1002 06:46:51.433861 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.433869 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.449795 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.487276 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.526899 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.535074 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.535100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.535108 4786 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.535120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.535128 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.571971 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1
c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.607197 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.637511 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc 
kubenswrapper[4786]: I1002 06:46:51.637544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.637554 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.637565 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.637574 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.650685 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.686880 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.727449 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.739669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.739716 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.739725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.739736 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.739745 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.766550 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.808888 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 
06:46:51.841405 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.841432 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.841442 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.841454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.841463 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.847136 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375
e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.852365 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.852431 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:51 
crc kubenswrapper[4786]: I1002 06:46:51.852472 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.852514 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:46:55.85249618 +0000 UTC m=+25.973679312 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.852519 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.852550 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 
06:46:51.852562 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:55.85255509 +0000 UTC m=+25.973738221 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.852612 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.852616 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.852634 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.852645 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.852647 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2025-10-02 06:46:55.852636002 +0000 UTC m=+25.973819133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.852705 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:55.852681407 +0000 UTC m=+25.973864538 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.888323 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.927617 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.943065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.943101 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.943110 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.943123 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.943133 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:51Z","lastTransitionTime":"2025-10-02T06:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.953468 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.953633 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.953658 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.953668 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:51 crc kubenswrapper[4786]: E1002 06:46:51.953733 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 06:46:55.953718219 +0000 UTC m=+26.074901350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:51 crc kubenswrapper[4786]: I1002 06:46:51.966862 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.008367 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 
06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.044607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.044637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.044646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.044658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.044666 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.146551 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.146586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.146594 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.146608 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.146616 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.179272 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.179329 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.179292 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:52 crc kubenswrapper[4786]: E1002 06:46:52.179394 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:46:52 crc kubenswrapper[4786]: E1002 06:46:52.179509 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:46:52 crc kubenswrapper[4786]: E1002 06:46:52.179586 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.247896 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.247925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.247935 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.247948 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.247959 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.279817 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.282163 4786 generic.go:334] "Generic (PLEG): container finished" podID="b625fb23-ba7e-4931-b753-94dc23e8effa" containerID="67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980" exitCode=0 Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.282193 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" event={"ID":"b625fb23-ba7e-4931-b753-94dc23e8effa","Type":"ContainerDied","Data":"67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.291398 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.300324 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.308516 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.318093 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.329730 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.338875 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.346572 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.349902 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.349929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.349938 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.349950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.349958 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.355641 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.365782 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.411160 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.446529 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.451542 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.451565 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.451575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.451588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.451596 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.487630 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.526970 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.553445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc 
kubenswrapper[4786]: I1002 06:46:52.553461 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.553469 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.553480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.553488 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.571438 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.655002 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.655038 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.655048 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.655063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.655073 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.763645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.763669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.763683 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.763715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.763724 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.866032 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.866054 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.866063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.866073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.866081 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.968000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.968021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.968029 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.968039 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:52 crc kubenswrapper[4786]: I1002 06:46:52.968047 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:52Z","lastTransitionTime":"2025-10-02T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.070076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.070097 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.070104 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.070115 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.070122 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.172145 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.172165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.172172 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.172182 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.172189 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.273344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.273409 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.273422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.273435 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.273445 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.286273 4786 generic.go:334] "Generic (PLEG): container finished" podID="b625fb23-ba7e-4931-b753-94dc23e8effa" containerID="164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77" exitCode=0 Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.286311 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" event={"ID":"b625fb23-ba7e-4931-b753-94dc23e8effa","Type":"ContainerDied","Data":"164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.295035 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf
2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.305325 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 
06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.314414 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.322794 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.330939 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.344114 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.352523 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.362180 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.370739 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.375280 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc 
kubenswrapper[4786]: I1002 06:46:53.375307 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.375316 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.375334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.375343 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.384028 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.393602 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.402013 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.410561 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.418491 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:53Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.476792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc 
kubenswrapper[4786]: I1002 06:46:53.476822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.476831 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.476844 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.476852 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.578327 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.578354 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.578362 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.578374 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.578382 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.680217 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.680244 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.680252 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.680263 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.680271 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.782122 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.782151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.782160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.782172 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.782180 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.884024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.884056 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.884064 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.884076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.884082 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.985986 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.986020 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.986030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.986043 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:53 crc kubenswrapper[4786]: I1002 06:46:53.986051 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:53Z","lastTransitionTime":"2025-10-02T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.087879 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.087910 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.087919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.087932 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.087942 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.178793 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.178825 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:54 crc kubenswrapper[4786]: E1002 06:46:54.178881 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.178793 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:54 crc kubenswrapper[4786]: E1002 06:46:54.178980 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:46:54 crc kubenswrapper[4786]: E1002 06:46:54.179028 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.189291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.189310 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.189318 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.189328 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.189337 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.290399 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.290596 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.290607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.290619 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.290629 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.292052 4786 generic.go:334] "Generic (PLEG): container finished" podID="b625fb23-ba7e-4931-b753-94dc23e8effa" containerID="eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514" exitCode=0 Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.292116 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" event={"ID":"b625fb23-ba7e-4931-b753-94dc23e8effa","Type":"ContainerDied","Data":"eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.296722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.297221 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.303200 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.335860 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.349323 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.366262 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.375608 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.384736 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.392748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.392775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.392784 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.392797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.392808 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.393817 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.406507 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.414392 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.426521 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.435954 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.445851 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.454095 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c3175615
82892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.464756 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.474774 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.489921 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.494624 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.494654 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.494664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.494675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.494683 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.499759 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.508880 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.519319 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.533541 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.543391 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.554536 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.563163 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.575206 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.585672 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.596525 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.596559 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.596568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.596580 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.596589 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.596732 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375
e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.604735 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.613503 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.620527 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:54Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.698330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.698357 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.698367 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.698379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.698387 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.800488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.800517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.800526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.800540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.800548 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.902400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.902429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.902437 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.902447 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:54 crc kubenswrapper[4786]: I1002 06:46:54.902455 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:54Z","lastTransitionTime":"2025-10-02T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.004464 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.004499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.004509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.004521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.004531 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.105900 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.105929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.105937 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.105948 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.105963 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.208245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.208273 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.208282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.208291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.208298 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.301643 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" event={"ID":"b625fb23-ba7e-4931-b753-94dc23e8effa","Type":"ContainerStarted","Data":"9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.301680 4786 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.301964 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.310273 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.310304 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.310313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.310325 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.310334 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.310717 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.318515 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.319196 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.326656 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.336190 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.344848 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.352918 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.360315 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.368486 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.374791 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.388762 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.398020 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.405860 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.412279 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.412305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.412313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.412323 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.412331 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.414522 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:
46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.426831 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.433216 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.442389 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.450095 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.457646 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.465794 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.478169 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.488918 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.497505 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.506702 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.513795 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc 
kubenswrapper[4786]: I1002 06:46:55.513823 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.513831 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.513843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.513852 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.520572 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.531489 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://957
0f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.549990 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.561133 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.574583 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:55Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.616316 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc 
kubenswrapper[4786]: I1002 06:46:55.616347 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.616355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.616367 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.616376 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.718232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.718265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.718275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.718287 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.718295 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.819774 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.819809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.819818 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.819830 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.819838 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.887932 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.888024 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.888047 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888095 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:47:03.88807506 +0000 UTC m=+34.009258202 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888105 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888120 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.888160 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888167 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:03.888152786 +0000 UTC m=+34.009335917 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888213 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:03.888204333 +0000 UTC m=+34.009387474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888293 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888305 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888316 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.888354 4786 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:03.888346829 +0000 UTC m=+34.009529970 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.922149 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.922176 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.922186 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.922198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.922209 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:55Z","lastTransitionTime":"2025-10-02T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:55 crc kubenswrapper[4786]: I1002 06:46:55.988766 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.988968 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.988999 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.989009 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:55 crc kubenswrapper[4786]: E1002 06:46:55.989054 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:03.989041461 +0000 UTC m=+34.110224591 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.023985 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.024014 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.024022 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.024036 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.024045 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.125779 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.125832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.125843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.125855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.125864 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.178433 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.178458 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.178465 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:56 crc kubenswrapper[4786]: E1002 06:46:56.178523 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:46:56 crc kubenswrapper[4786]: E1002 06:46:56.178594 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:46:56 crc kubenswrapper[4786]: E1002 06:46:56.178644 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.228123 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.228151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.228159 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.228171 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.228180 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.304931 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/0.log" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.307141 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09" exitCode=1 Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.307172 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.307594 4786 scope.go:117] "RemoveContainer" containerID="7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.317541 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.327752 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.330041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.330065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.330073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.330084 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.330092 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.335769 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.344368 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.350842 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.365201 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.373877 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.381859 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.390755 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.402784 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"ice (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 06:46:56.011135 6050 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 06:46:56.011276 6050 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 06:46:56.011605 6050 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 06:46:56.011620 6050 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 06:46:56.011624 6050 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 06:46:56.011632 6050 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 06:46:56.011639 6050 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 06:46:56.011665 6050 factory.go:656] Stopping watch factory\\\\nI1002 06:46:56.011677 6050 ovnkube.go:599] Stopped ovnkube\\\\nI1002 06:46:56.011709 6050 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 06:46:56.011716 6050 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 06:46:56.011726 6050 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 
06:46:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44
571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.411013 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.418860 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.426052 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.431458 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc 
kubenswrapper[4786]: I1002 06:46:56.431485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.431494 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.431507 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.431516 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.437540 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.533198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.533231 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.533240 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.533253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.533261 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.634988 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.635018 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.635026 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.635038 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.635046 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.737398 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.737427 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.737443 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.737456 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.737465 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.838947 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.838974 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.838982 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.838993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.839001 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.940399 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.940426 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.940434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.940445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:56 crc kubenswrapper[4786]: I1002 06:46:56.940453 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:56Z","lastTransitionTime":"2025-10-02T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.042119 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.042154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.042166 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.042179 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.042187 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.143224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.143249 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.143257 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.143269 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.143275 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.245427 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.245451 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.245459 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.245468 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.245477 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.309999 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/1.log" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.310539 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/0.log" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.312607 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a" exitCode=1 Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.312633 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.312668 4786 scope.go:117] "RemoveContainer" containerID="7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.313098 4786 scope.go:117] "RemoveContainer" containerID="c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a" Oct 02 06:46:57 crc kubenswrapper[4786]: E1002 06:46:57.313220 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.326242 4786 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d
05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9a
d6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.334684 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.342488 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.346872 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.346913 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.346933 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.346943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.346950 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.350498 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:
46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.361999 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d4820440d78584ece6e739d7c53f4bd2f4bb602d3c5c269863051ead4f7ee09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"ice (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 06:46:56.011135 6050 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1002 06:46:56.011276 6050 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 06:46:56.011605 6050 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1002 06:46:56.011620 6050 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 06:46:56.011624 6050 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 06:46:56.011632 6050 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 06:46:56.011639 6050 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1002 06:46:56.011665 6050 factory.go:656] Stopping watch factory\\\\nI1002 06:46:56.011677 6050 ovnkube.go:599] Stopped ovnkube\\\\nI1002 06:46:56.011709 6050 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 06:46:56.011716 6050 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1002 06:46:56.011726 6050 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 06:46:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could 
not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\
\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc 
kubenswrapper[4786]: I1002 06:46:57.369668 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.377302 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.383954 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.392471 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.400598 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.407859 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.414493 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.421959 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.428130 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:57Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.448341 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.448378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.448390 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.448404 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.448412 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.550163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.550189 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.550198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.550209 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.550218 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.652351 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.652374 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.652383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.652392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.652399 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.753769 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.753793 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.753804 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.753815 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.753824 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.855606 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.855635 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.855644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.855666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.855673 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.957427 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.957450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.957458 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.957466 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:57 crc kubenswrapper[4786]: I1002 06:46:57.957474 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:57Z","lastTransitionTime":"2025-10-02T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.061444 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.061485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.061495 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.061510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.061519 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.163031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.163143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.163201 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.163259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.163319 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.178184 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:46:58 crc kubenswrapper[4786]: E1002 06:46:58.178278 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.178298 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.178335 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:46:58 crc kubenswrapper[4786]: E1002 06:46:58.178399 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:46:58 crc kubenswrapper[4786]: E1002 06:46:58.178537 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.264928 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.264956 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.264983 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.264994 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.265004 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.317213 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/1.log" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.320386 4786 scope.go:117] "RemoveContainer" containerID="c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a" Oct 02 06:46:58 crc kubenswrapper[4786]: E1002 06:46:58.320563 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.329550 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.336850 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.346260 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.354114 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.361137 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.366257 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.366284 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.366294 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.366305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.366319 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.368732 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.375564 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.384109 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.392327 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.401243 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.414492 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.422584 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.434233 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.442457 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:58Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.467710 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc 
kubenswrapper[4786]: I1002 06:46:58.467741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.467750 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.467763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.467772 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.569013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.569040 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.569049 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.569060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.569068 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.671644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.671675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.671706 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.671720 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.671729 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.773210 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.773312 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.773383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.773452 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.773506 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.875044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.875077 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.875087 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.875100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.875109 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.977033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.977058 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.977066 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.977078 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:58 crc kubenswrapper[4786]: I1002 06:46:58.977085 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:58Z","lastTransitionTime":"2025-10-02T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.078922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.078950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.078958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.078967 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.078974 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.180304 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.180336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.180344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.180356 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.180364 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.281673 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.281716 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.281725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.281734 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.281741 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.383748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.383774 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.383782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.383792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.383800 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.444613 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds"] Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.444931 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.446655 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.447204 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.456504 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.464769 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.472023 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.481480 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.484983 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.485017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.485026 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.485040 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.485050 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.490581 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.498794 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.505820 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.513245 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.519805 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.531943 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.539511 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.547251 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.554747 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.566979 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.587049 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.587076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.587084 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.587098 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.587107 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.591859 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:59Z is after 2025-08-24T17:21:41Z" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.617240 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94spq\" (UniqueName: \"kubernetes.io/projected/994073bb-c2bd-4b85-8807-91891f492145-kube-api-access-94spq\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.617290 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/994073bb-c2bd-4b85-8807-91891f492145-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.617321 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/994073bb-c2bd-4b85-8807-91891f492145-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.617341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/994073bb-c2bd-4b85-8807-91891f492145-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: 
\"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.689193 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.689222 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.689231 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.689245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.689254 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.718595 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/994073bb-c2bd-4b85-8807-91891f492145-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.718631 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/994073bb-c2bd-4b85-8807-91891f492145-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.718665 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94spq\" (UniqueName: \"kubernetes.io/projected/994073bb-c2bd-4b85-8807-91891f492145-kube-api-access-94spq\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.718712 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/994073bb-c2bd-4b85-8807-91891f492145-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.719237 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/994073bb-c2bd-4b85-8807-91891f492145-env-overrides\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.719258 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/994073bb-c2bd-4b85-8807-91891f492145-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.722797 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/994073bb-c2bd-4b85-8807-91891f492145-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.731952 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94spq\" (UniqueName: \"kubernetes.io/projected/994073bb-c2bd-4b85-8807-91891f492145-kube-api-access-94spq\") pod \"ovnkube-control-plane-749d76644c-2v2ds\" (UID: \"994073bb-c2bd-4b85-8807-91891f492145\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.754739 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" Oct 02 06:46:59 crc kubenswrapper[4786]: W1002 06:46:59.764376 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994073bb_c2bd_4b85_8807_91891f492145.slice/crio-e428dc0610e3af702d211dd744bf490ec9df115dc0c769e797628311d8493f72 WatchSource:0}: Error finding container e428dc0610e3af702d211dd744bf490ec9df115dc0c769e797628311d8493f72: Status 404 returned error can't find the container with id e428dc0610e3af702d211dd744bf490ec9df115dc0c769e797628311d8493f72 Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.791247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.791270 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.791278 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.791290 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.791298 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.893165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.893197 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.893206 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.893218 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.893226 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.994938 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.994969 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.994977 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.994989 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:46:59 crc kubenswrapper[4786]: I1002 06:46:59.994997 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:46:59Z","lastTransitionTime":"2025-10-02T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.096405 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.096600 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.096609 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.096635 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.096645 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.178960 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.178987 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.178987 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.179049 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.179150 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.179214 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.192418 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.197893 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.197922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.197933 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.197952 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.197962 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.200228 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.208613 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.219299 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.231570 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.239039 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.250325 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.258953 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.267106 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.274467 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.281101 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.290816 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.298675 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.299581 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.299603 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.299614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.299626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.299636 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.306615 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.314951 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.325116 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" 
event={"ID":"994073bb-c2bd-4b85-8807-91891f492145","Type":"ContainerStarted","Data":"e3ca6123d7afd007b6fd70b394fd00edd8da0213ed55f8cb5d75940aaeec4f81"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.325147 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" event={"ID":"994073bb-c2bd-4b85-8807-91891f492145","Type":"ContainerStarted","Data":"c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.325159 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" event={"ID":"994073bb-c2bd-4b85-8807-91891f492145","Type":"ContainerStarted","Data":"e428dc0610e3af702d211dd744bf490ec9df115dc0c769e797628311d8493f72"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.333733 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.345960 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.353034 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.362225 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.370094 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.377915 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.384769 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.391107 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.400157 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.401021 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.401053 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.401063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.401075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.401082 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.408232 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375
e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.415639 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.423219 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.437610 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.445816 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.453684 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.502850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.502888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.502897 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.502917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.502926 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.604677 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.604802 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.604884 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.604954 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.605010 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.704573 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.706747 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.706766 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.706774 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.706782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.706789 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.712998 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.721309 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.734275 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.741524 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c3175615
82892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.750736 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.758432 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.766907 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.775273 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.781943 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.790086 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.797890 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.805001 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.807895 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.807923 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.807933 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.807946 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.807955 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.817258 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.824726 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.832278 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.850760 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9h5tj"] Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.851029 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-p8zkp"] Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.851152 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.851257 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.851300 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.852306 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.853720 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.853730 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.853784 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.857311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.857341 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.857352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.857363 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: 
I1002 06:47:00.857372 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.865518 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.865633 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.868220 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.868247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.868255 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.868268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.868276 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.873493 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.876245 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.880679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.880716 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.880725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.880736 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.880745 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.883216 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.891499 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.891863 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.894314 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.894344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.894352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.894365 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.894373 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.900287 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z 
is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.902028 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.904132 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.904153 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.904160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.904170 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.904178 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.911600 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: E1002 06:47:00.911720 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.912614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.912655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.912664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.912674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.912681 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:00Z","lastTransitionTime":"2025-10-02T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.913918 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.921020 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.927402 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc 
kubenswrapper[4786]: I1002 06:47:00.935426 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.943115 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.950408 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.959221 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.968044 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.975803 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.982843 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.990418 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:00 crc kubenswrapper[4786]: I1002 06:47:00.996674 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:00Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.008794 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.014221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.014246 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.014256 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.014267 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 
06:47:01.014276 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.016228 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.023608 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.028090 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mj7\" (UniqueName: \"kubernetes.io/projected/09d5f210-1181-43ab-b65a-12ba1f5a9255-kube-api-access-g9mj7\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc 
kubenswrapper[4786]: I1002 06:47:01.028139 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.028170 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldt5z\" (UniqueName: \"kubernetes.io/projected/6e4217c0-9581-4727-b594-adb99293f7db-kube-api-access-ldt5z\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.028186 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09d5f210-1181-43ab-b65a-12ba1f5a9255-host\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.028200 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09d5f210-1181-43ab-b65a-12ba1f5a9255-serviceca\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.029671 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.037346 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.048727 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.055470 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.066812 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.106478 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.115785 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.115832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.115842 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.115855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.115864 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.128518 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.128565 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldt5z\" (UniqueName: \"kubernetes.io/projected/6e4217c0-9581-4727-b594-adb99293f7db-kube-api-access-ldt5z\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.128587 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09d5f210-1181-43ab-b65a-12ba1f5a9255-host\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.128605 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09d5f210-1181-43ab-b65a-12ba1f5a9255-serviceca\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: E1002 06:47:01.128617 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.128643 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mj7\" (UniqueName: 
\"kubernetes.io/projected/09d5f210-1181-43ab-b65a-12ba1f5a9255-kube-api-access-g9mj7\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: E1002 06:47:01.128662 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs podName:6e4217c0-9581-4727-b594-adb99293f7db nodeName:}" failed. No retries permitted until 2025-10-02 06:47:01.628648477 +0000 UTC m=+31.749831608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs") pod "network-metrics-daemon-p8zkp" (UID: "6e4217c0-9581-4727-b594-adb99293f7db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.128742 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09d5f210-1181-43ab-b65a-12ba1f5a9255-host\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.129599 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09d5f210-1181-43ab-b65a-12ba1f5a9255-serviceca\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.144921 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c3175615
82892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.172131 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldt5z\" (UniqueName: \"kubernetes.io/projected/6e4217c0-9581-4727-b594-adb99293f7db-kube-api-access-ldt5z\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.191619 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g9mj7\" (UniqueName: \"kubernetes.io/projected/09d5f210-1181-43ab-b65a-12ba1f5a9255-kube-api-access-g9mj7\") pod \"node-ca-9h5tj\" (UID: \"09d5f210-1181-43ab-b65a-12ba1f5a9255\") " pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.217666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.217720 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.217731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.217743 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.217752 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.227687 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.264459 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc 
kubenswrapper[4786]: I1002 06:47:01.307665 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5
ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.319740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.319767 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.319775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.319786 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.319795 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.346072 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.384992 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.421883 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.421929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.421939 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.421951 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.421959 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.427538 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.460612 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9h5tj" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.465384 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:01Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.523741 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.523774 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.523784 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.523796 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.523806 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.626041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.626067 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.626074 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.626087 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.626095 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.633402 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:01 crc kubenswrapper[4786]: E1002 06:47:01.633507 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:01 crc kubenswrapper[4786]: E1002 06:47:01.633551 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs podName:6e4217c0-9581-4727-b594-adb99293f7db nodeName:}" failed. No retries permitted until 2025-10-02 06:47:02.63354056 +0000 UTC m=+32.754723691 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs") pod "network-metrics-daemon-p8zkp" (UID: "6e4217c0-9581-4727-b594-adb99293f7db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.727152 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.727179 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.727188 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.727199 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.727228 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.829451 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.829479 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.829489 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.829500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.829509 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.931399 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.931422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.931430 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.931439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:01 crc kubenswrapper[4786]: I1002 06:47:01.931447 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:01Z","lastTransitionTime":"2025-10-02T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.033173 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.033198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.033206 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.033215 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.033223 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.135313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.135488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.135496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.135506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.135512 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.178630 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.178717 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.178733 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:02 crc kubenswrapper[4786]: E1002 06:47:02.178830 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.178881 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:02 crc kubenswrapper[4786]: E1002 06:47:02.178938 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:02 crc kubenswrapper[4786]: E1002 06:47:02.179020 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:02 crc kubenswrapper[4786]: E1002 06:47:02.179051 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.236641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.236668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.236680 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.236712 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.236721 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.330439 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9h5tj" event={"ID":"09d5f210-1181-43ab-b65a-12ba1f5a9255","Type":"ContainerStarted","Data":"f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.330480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9h5tj" event={"ID":"09d5f210-1181-43ab-b65a-12ba1f5a9255","Type":"ContainerStarted","Data":"2c30380c72697d62a43e8addde9215b5b5b464f9eb8ad23df14f9a75e9497592"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.337944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.337970 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.337978 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.337988 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.337996 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.340933 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.347881 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.356279 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.365371 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.372774 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.378919 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.391494 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.398946 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.408278 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.415381 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.423320 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.435200 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.439868 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.439907 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.439917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.439929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.439937 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.442444 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.451919 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.458565 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc 
kubenswrapper[4786]: I1002 06:47:02.466928 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.475227 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:02Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.541473 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.541501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.541510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.541521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.541529 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.640144 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:02 crc kubenswrapper[4786]: E1002 06:47:02.640235 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:02 crc kubenswrapper[4786]: E1002 06:47:02.640278 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs podName:6e4217c0-9581-4727-b594-adb99293f7db nodeName:}" failed. No retries permitted until 2025-10-02 06:47:04.640266602 +0000 UTC m=+34.761449734 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs") pod "network-metrics-daemon-p8zkp" (UID: "6e4217c0-9581-4727-b594-adb99293f7db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.642994 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.643018 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.643026 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.643039 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.643047 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.744445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.744472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.744482 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.744494 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.744502 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.845837 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.845871 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.845880 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.845890 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.845900 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.947017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.947043 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.947051 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.947063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:02 crc kubenswrapper[4786]: I1002 06:47:02.947071 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:02Z","lastTransitionTime":"2025-10-02T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.048760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.048787 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.048796 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.048807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.048815 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.150429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.150465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.150475 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.150486 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.150493 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.251995 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.252031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.252042 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.252055 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.252065 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.353786 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.353821 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.353829 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.353871 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.353880 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.455760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.456125 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.456195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.456262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.456327 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.558176 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.558274 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.558330 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.558398 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.558465 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.660303 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.660328 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.660336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.660346 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.660352 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.761876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.761900 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.761908 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.761918 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.761925 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.863760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.863778 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.863786 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.863795 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.863803 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.913115 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.913855 4786 scope.go:117] "RemoveContainer" containerID="c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a" Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.913971 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.950596 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.950671 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:47:19.950658411 +0000 UTC m=+50.071841532 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.950899 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.950960 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.951021 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.951313 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 
06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.951414 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.951498 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.951322 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.951344 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.951713 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:19.951699289 +0000 UTC m=+50.072882620 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.951796 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:19.95178022 +0000 UTC m=+50.072963352 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:47:03 crc kubenswrapper[4786]: E1002 06:47:03.951818 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:19.95181197 +0000 UTC m=+50.072995101 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.965348 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.965383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.965392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.965403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:03 crc kubenswrapper[4786]: I1002 06:47:03.965411 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:03Z","lastTransitionTime":"2025-10-02T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.051825 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.051967 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.051987 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.051998 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.052037 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:20.052027655 +0000 UTC m=+50.173210786 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.066890 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.066921 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.066930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.066943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.066951 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.168985 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.169026 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.169044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.169055 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.169062 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.178240 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.178335 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.178592 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.178659 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.178726 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.178773 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.178961 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.179087 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.271125 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.271159 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.271170 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.271183 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.271192 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.372442 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.372471 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.372480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.372489 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.372496 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.474558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.474589 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.474598 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.474610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.474618 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.576050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.576081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.576091 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.576102 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.576110 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.657445 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.657565 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:04 crc kubenswrapper[4786]: E1002 06:47:04.657643 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs podName:6e4217c0-9581-4727-b594-adb99293f7db nodeName:}" failed. No retries permitted until 2025-10-02 06:47:08.657623888 +0000 UTC m=+38.778807039 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs") pod "network-metrics-daemon-p8zkp" (UID: "6e4217c0-9581-4727-b594-adb99293f7db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.677667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.677710 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.677719 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.677729 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.677736 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.779276 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.779306 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.779315 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.779327 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.779335 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.881594 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.881623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.881631 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.881641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.881669 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.983450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.983480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.983488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.983499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:04 crc kubenswrapper[4786]: I1002 06:47:04.983508 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:04Z","lastTransitionTime":"2025-10-02T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.084661 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.084704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.084714 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.084724 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.084731 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.186005 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.186034 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.186043 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.186053 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.186059 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.287583 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.287606 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.287615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.287624 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.287633 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.389450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.389476 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.389483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.389492 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.389500 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.491228 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.491282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.491293 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.491305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.491315 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.592858 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.592886 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.592895 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.592905 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.592911 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.694512 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.694542 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.694550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.694560 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.694568 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.796254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.796283 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.796291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.796302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.796310 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.898281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.898332 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.898343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.898355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:05 crc kubenswrapper[4786]: I1002 06:47:05.898364 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:05Z","lastTransitionTime":"2025-10-02T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.000374 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.000397 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.000407 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.000417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.000425 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.102363 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.102398 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.102408 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.102419 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.102427 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.178542 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.178575 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:06 crc kubenswrapper[4786]: E1002 06:47:06.178638 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.178650 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.178679 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:06 crc kubenswrapper[4786]: E1002 06:47:06.178764 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:06 crc kubenswrapper[4786]: E1002 06:47:06.178828 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:06 crc kubenswrapper[4786]: E1002 06:47:06.178871 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.203894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.203950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.203960 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.203970 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.203979 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.305936 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.305959 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.305969 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.305979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.305988 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.408154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.408195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.408205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.408216 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.408226 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.509833 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.509867 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.509878 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.509891 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.509900 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.611477 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.611526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.611535 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.611547 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.611555 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.713498 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.713536 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.713546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.713558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.713588 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.814792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.814840 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.814850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.814862 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.814873 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.916711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.916746 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.916755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.916768 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:06 crc kubenswrapper[4786]: I1002 06:47:06.916776 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:06Z","lastTransitionTime":"2025-10-02T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.018323 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.018347 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.018355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.018366 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.018374 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.120437 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.120465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.120474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.120485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.120494 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.222841 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.222869 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.222878 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.222889 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.222897 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.324044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.324072 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.324081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.324091 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.324099 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.425755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.425793 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.425802 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.425829 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.425839 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.527479 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.527509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.527517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.527527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.527535 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.629597 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.629619 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.629628 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.629637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.629645 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.731470 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.731505 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.731515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.731531 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.731542 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.832867 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.832987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.833057 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.833123 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.833183 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.934266 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.934287 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.934296 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.934307 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:07 crc kubenswrapper[4786]: I1002 06:47:07.934314 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:07Z","lastTransitionTime":"2025-10-02T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.036060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.036090 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.036101 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.036112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.036121 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.137961 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.137997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.138007 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.138020 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.138029 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.178789 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.178863 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:08 crc kubenswrapper[4786]: E1002 06:47:08.178962 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.179032 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.179055 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:08 crc kubenswrapper[4786]: E1002 06:47:08.179128 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:08 crc kubenswrapper[4786]: E1002 06:47:08.179162 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:08 crc kubenswrapper[4786]: E1002 06:47:08.179226 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.239835 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.239863 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.239873 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.239883 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.239890 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.342079 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.342107 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.342116 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.342126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.342140 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.444429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.444472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.444480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.444491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.444500 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.545928 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.545978 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.545987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.546001 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.546009 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.647943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.647966 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.647973 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.647985 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.648008 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.689587 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:08 crc kubenswrapper[4786]: E1002 06:47:08.689734 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:08 crc kubenswrapper[4786]: E1002 06:47:08.689783 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs podName:6e4217c0-9581-4727-b594-adb99293f7db nodeName:}" failed. No retries permitted until 2025-10-02 06:47:16.689764941 +0000 UTC m=+46.810948072 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs") pod "network-metrics-daemon-p8zkp" (UID: "6e4217c0-9581-4727-b594-adb99293f7db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.750098 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.750126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.750135 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.750163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.750171 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.851982 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.852016 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.852025 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.852038 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.852048 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.954064 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.954098 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.954107 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.954120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:08 crc kubenswrapper[4786]: I1002 06:47:08.954131 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:08Z","lastTransitionTime":"2025-10-02T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.055735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.055763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.055792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.055803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.055809 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.157301 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.157336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.157347 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.157360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.157370 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.258496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.258520 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.258529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.258540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.258547 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.360453 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.360476 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.360484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.360495 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.360502 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.462061 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.462089 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.462098 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.462107 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.462114 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.563584 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.563614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.563622 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.563632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.563639 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.665439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.665479 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.665488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.665497 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.665504 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.766780 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.766806 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.766814 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.766823 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.766834 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.868463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.868492 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.868500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.868508 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.868515 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.969515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.969546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.969555 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.969567 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:09 crc kubenswrapper[4786]: I1002 06:47:09.969575 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:09Z","lastTransitionTime":"2025-10-02T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.071187 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.071229 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.071237 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.071247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.071254 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.172450 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.172490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.172499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.172510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.172518 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.178758 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.178773 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.178762 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:10 crc kubenswrapper[4786]: E1002 06:47:10.178835 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.178862 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:10 crc kubenswrapper[4786]: E1002 06:47:10.178921 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:10 crc kubenswrapper[4786]: E1002 06:47:10.178971 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:10 crc kubenswrapper[4786]: E1002 06:47:10.179012 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.187430 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.195623 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.202262 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.211026 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.217267 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc 
kubenswrapper[4786]: I1002 06:47:10.225314 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5
ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.233222 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.240789 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.248385 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.256047 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.269962 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.274449 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.274479 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.274487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.274499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 
06:47:10.274507 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.277710 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.285376 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.291386 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.299344 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.335404 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.346849 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:10Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.376759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.376794 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.376803 4786 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.376815 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.376824 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.478461 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.478490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.478501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.478515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.478524 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.580120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.580160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.580169 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.580181 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.580191 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.681552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.681579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.681590 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.681606 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.681615 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.782836 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.782866 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.782876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.782888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.782897 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.884454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.884483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.884500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.884511 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.884518 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.986477 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.986509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.986519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.986532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:10 crc kubenswrapper[4786]: I1002 06:47:10.986540 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:10Z","lastTransitionTime":"2025-10-02T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.088195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.088221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.088230 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.088240 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.088249 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.186364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.186392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.186413 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.186422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.186429 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: E1002 06:47:11.194530 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:11Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.196618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.196641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.196649 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.196659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.196666 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: E1002 06:47:11.204512 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:11Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.207081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.207110 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.207118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.207130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.207138 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: E1002 06:47:11.215411 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:11Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.219902 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.219928 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.219937 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.219947 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.219954 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: E1002 06:47:11.227592 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:11Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.229470 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.229493 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.229501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.229510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.229517 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: E1002 06:47:11.236962 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:11Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:11 crc kubenswrapper[4786]: E1002 06:47:11.237062 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.237825 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.237848 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.237856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.237864 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.237871 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.339680 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.339718 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.339726 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.339746 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.339754 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.441250 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.441273 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.441280 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.441290 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.441297 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.543012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.543035 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.543042 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.543051 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.543058 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.644198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.644224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.644233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.644243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.644251 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.745861 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.745890 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.745899 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.745910 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.745918 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.847465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.847487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.847494 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.847502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.847509 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.948935 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.948957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.948964 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.948973 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:11 crc kubenswrapper[4786]: I1002 06:47:11.948979 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:11Z","lastTransitionTime":"2025-10-02T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.050893 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.050913 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.050921 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.050930 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.050937 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.152514 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.152559 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.152568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.152578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.152584 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.178438 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.178458 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.178469 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.178521 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:12 crc kubenswrapper[4786]: E1002 06:47:12.178519 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:12 crc kubenswrapper[4786]: E1002 06:47:12.178591 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:12 crc kubenswrapper[4786]: E1002 06:47:12.178643 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:12 crc kubenswrapper[4786]: E1002 06:47:12.178675 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.254568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.254615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.254623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.254633 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.254641 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.355916 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.355944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.355954 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.355963 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.355969 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.456934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.456962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.456971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.456982 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.456989 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.558578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.558602 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.558610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.558619 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.558627 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.659873 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.659911 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.659919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.659929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.659935 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.761336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.761358 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.761365 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.761374 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.761380 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.863165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.863189 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.863196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.863205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.863211 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.965345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.965376 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.965385 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.965397 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:12 crc kubenswrapper[4786]: I1002 06:47:12.965411 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:12Z","lastTransitionTime":"2025-10-02T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.067841 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.067946 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.068017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.068084 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.068154 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.170481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.170623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.170748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.170850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.171027 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.272631 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.272654 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.272663 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.272672 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.272682 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.374303 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.374350 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.374360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.374380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.374393 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.476079 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.476112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.476125 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.476139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.476148 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.577557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.577586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.577594 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.577607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.577617 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.679518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.679538 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.679545 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.679559 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.679567 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.781364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.781407 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.781418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.781436 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.781449 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.883408 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.883544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.883604 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.883678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.883780 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.985883 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.985917 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.985928 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.985944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:13 crc kubenswrapper[4786]: I1002 06:47:13.985952 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:13Z","lastTransitionTime":"2025-10-02T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.087684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.087742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.087750 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.087764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.087772 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.178193 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.178232 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:14 crc kubenswrapper[4786]: E1002 06:47:14.178282 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.178301 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.178331 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:14 crc kubenswrapper[4786]: E1002 06:47:14.178452 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:14 crc kubenswrapper[4786]: E1002 06:47:14.178560 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:14 crc kubenswrapper[4786]: E1002 06:47:14.178654 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.189396 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.189427 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.189439 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.189454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.189466 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.291785 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.291826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.291834 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.291848 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.291857 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.393713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.393754 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.393763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.393779 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.393790 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.495496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.495529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.495538 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.495552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.495561 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.597298 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.597349 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.597358 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.597379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.597389 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.699167 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.699196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.699204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.699217 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.699224 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.801221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.801245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.801253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.801265 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.801274 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.903011 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.903047 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.903057 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.903070 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:14 crc kubenswrapper[4786]: I1002 06:47:14.903081 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:14Z","lastTransitionTime":"2025-10-02T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.005128 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.005156 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.005163 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.005174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.005182 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.106661 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.106715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.106725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.106733 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.106740 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.209302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.209341 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.209356 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.209370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.209380 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.312612 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.312678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.312731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.312746 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.312756 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.415359 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.415395 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.415403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.415418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.415425 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.517752 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.517806 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.517816 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.517829 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.517837 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.619992 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.620071 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.620085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.620097 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.620369 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.721803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.721847 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.721856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.721868 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.721877 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.823604 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.823646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.823655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.823669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.823703 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.925475 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.925556 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.925567 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.925579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:15 crc kubenswrapper[4786]: I1002 06:47:15.925587 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:15Z","lastTransitionTime":"2025-10-02T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.027205 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.027239 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.027248 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.027260 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.027269 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.128337 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.128370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.128380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.128392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.128401 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.178481 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.178491 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:16 crc kubenswrapper[4786]: E1002 06:47:16.178572 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.178597 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.178491 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:16 crc kubenswrapper[4786]: E1002 06:47:16.178642 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:16 crc kubenswrapper[4786]: E1002 06:47:16.178738 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:16 crc kubenswrapper[4786]: E1002 06:47:16.178800 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.229759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.229808 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.229817 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.229827 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.229835 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.331239 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.331270 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.331296 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.331307 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.331314 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.432685 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.432727 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.432736 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.432746 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.432753 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.534352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.534384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.534391 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.534400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.534408 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.636582 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.636609 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.636617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.636626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.636632 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.737957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.737983 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.737991 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.738003 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.738010 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.752455 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:16 crc kubenswrapper[4786]: E1002 06:47:16.752554 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:16 crc kubenswrapper[4786]: E1002 06:47:16.752630 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs podName:6e4217c0-9581-4727-b594-adb99293f7db nodeName:}" failed. No retries permitted until 2025-10-02 06:47:32.752605101 +0000 UTC m=+62.873788242 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs") pod "network-metrics-daemon-p8zkp" (UID: "6e4217c0-9581-4727-b594-adb99293f7db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.839704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.839727 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.839735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.839746 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.839753 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.940971 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.941006 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.941016 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.941030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:16 crc kubenswrapper[4786]: I1002 06:47:16.941038 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:16Z","lastTransitionTime":"2025-10-02T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.042617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.042657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.042681 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.042713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.042724 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.144575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.144609 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.144619 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.144630 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.144640 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.246474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.246521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.246532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.246546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.246555 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.348502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.348572 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.348583 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.348595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.348604 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.450505 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.450533 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.450540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.450551 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.450558 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.552418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.552472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.552481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.552491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.552499 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.653721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.653747 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.653755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.653764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.653771 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.755717 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.755766 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.755777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.755789 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.755797 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.857224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.857260 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.857268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.857280 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.857288 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.959232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.959259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.959267 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.959277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:17 crc kubenswrapper[4786]: I1002 06:47:17.959284 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:17Z","lastTransitionTime":"2025-10-02T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.060341 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.060370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.060393 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.060405 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.060413 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.161490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.161521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.161532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.161543 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.161550 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.179100 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.179173 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.179397 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.179413 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:18 crc kubenswrapper[4786]: E1002 06:47:18.179503 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:18 crc kubenswrapper[4786]: E1002 06:47:18.179544 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.179593 4786 scope.go:117] "RemoveContainer" containerID="c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a" Oct 02 06:47:18 crc kubenswrapper[4786]: E1002 06:47:18.179604 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:18 crc kubenswrapper[4786]: E1002 06:47:18.179663 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.263174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.263400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.263409 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.263423 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.263431 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.364460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.364485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.364492 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.364504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.364513 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.365023 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/1.log" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.366614 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.367114 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.377391 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8
b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.395841 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: 
failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, 
V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\
\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.404189 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.416232 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.429718 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.439216 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.455533 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.463185 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc 
kubenswrapper[4786]: I1002 06:47:18.466041 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.466076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.466085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.466098 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.466107 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.472321 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375
e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.480338 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.490799 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.498111 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.506731 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.515915 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.525153 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.531745 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.544817 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06
:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:18Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.568273 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.568303 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.568313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.568324 4786 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.568332 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.669913 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.669943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.669952 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.669965 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.669973 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.771630 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.771667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.771677 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.771704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.771713 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.873599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.873626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.873642 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.873675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.873683 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.975546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.975569 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.975577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.975586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:18 crc kubenswrapper[4786]: I1002 06:47:18.975594 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:18Z","lastTransitionTime":"2025-10-02T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.077130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.077157 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.077189 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.077200 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.077207 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.178679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.178719 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.178740 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.178750 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.178757 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.221392 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.227836 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.230324 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.242177 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, 
V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\
\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.249188 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.258275 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.265824 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc 
kubenswrapper[4786]: I1002 06:47:19.273513 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.281036 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.281062 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.281071 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.281082 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.281091 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.281722 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.288614 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.294943 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.303055 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-c
ert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.310917 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.318461 4786 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.326363 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.338368 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.346063 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.353234 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.359549 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.369744 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/2.log" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.370185 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/1.log" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.372114 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089" exitCode=1 Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.372193 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.372238 4786 scope.go:117] "RemoveContainer" containerID="c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.372723 4786 scope.go:117] "RemoveContainer" containerID="f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089" Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.372857 
4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.380776 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.382763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.382798 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.382807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.382820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.382828 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.388432 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.396878 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.406817 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.413866 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.419874 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.431642 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.439137 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.446308 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.452984 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.460623 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.472155 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e4457
1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.479162 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c3175615
82892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.484533 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.484561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.484570 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc 
kubenswrapper[4786]: I1002 06:47:19.484582 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.484590 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.487733 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee
ca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.494153 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc 
kubenswrapper[4786]: I1002 06:47:19.501479 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.508939 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.516805 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:19Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.586532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.586577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.586587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.586600 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.586608 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.688539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.688567 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.688578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.688588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.688597 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.790561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.790593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.790602 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.790613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.790621 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.892433 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.892455 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.892463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.892472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.892478 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.977841 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.977894 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.977934 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:47:51.977922543 +0000 UTC m=+82.099105674 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.977956 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.977988 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.978029 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.978084 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:51.978072754 +0000 UTC m=+82.099255885 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.978090 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.978114 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.978123 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.978034 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.978165 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:51.978158254 +0000 UTC m=+82.099341386 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:47:19 crc kubenswrapper[4786]: E1002 06:47:19.978179 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 06:47:51.978170777 +0000 UTC m=+82.099353909 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.993658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.993710 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.993721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.993734 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:19 crc kubenswrapper[4786]: I1002 06:47:19.993743 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:19Z","lastTransitionTime":"2025-10-02T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.078852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:20 crc kubenswrapper[4786]: E1002 06:47:20.078959 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:47:20 crc kubenswrapper[4786]: E1002 06:47:20.078972 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:47:20 crc kubenswrapper[4786]: E1002 06:47:20.078981 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:20 crc kubenswrapper[4786]: E1002 06:47:20.079012 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-02 06:47:52.07900497 +0000 UTC m=+82.200188101 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.095855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.095888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.095908 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.095919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.095929 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.178212 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.178249 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.178286 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:20 crc kubenswrapper[4786]: E1002 06:47:20.178382 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.178403 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:20 crc kubenswrapper[4786]: E1002 06:47:20.178465 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:20 crc kubenswrapper[4786]: E1002 06:47:20.178537 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:20 crc kubenswrapper[4786]: E1002 06:47:20.178594 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.187278 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.195437 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.197304 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.197326 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.197334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.197345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.197353 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.203148 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.212401 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.218857 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc 
kubenswrapper[4786]: I1002 06:47:20.225911 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.233779 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.241101 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.249431 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.259233 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.269842 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.277829 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.285100 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.291728 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.299221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.299248 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.299256 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.299269 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.299278 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.303770 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.312140 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.323681 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c06abd4488abe0505d99509ed4f132271235d56339fe22914096c11fd112ce7a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:46:56Z\\\",\\\"message\\\":\\\"34] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI1002 06:46:56.885514 6178 lb_config.go:1031] Cluster endpoints for openshift-etcd/etcd for network=default are: map[]\\\\nF1002 06:46:56.885521 6178 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start 
default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:46:56Z is after 2025-08-24T17:21:41Z]\\\\nI1002 06:46:56.885524 6178 services_controller.go:443] Built service openshift-etcd/etcd LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.253\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:2379, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]stri\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e4457
1a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.330683 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.374919 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/2.log" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.377250 4786 scope.go:117] "RemoveContainer" containerID="f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089" Oct 02 06:47:20 crc kubenswrapper[4786]: 
E1002 06:47:20.377408 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.386444 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initCont
ainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.394145 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.400448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.400473 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.400481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.400492 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.400500 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.401149 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.409201 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run
/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.415743 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.427428 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.434881 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.442142 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.449147 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.456867 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.469369 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.476508 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.483863 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.491366 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.499326 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.501727 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.501748 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.501757 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.501768 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.501777 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.505980 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.514528 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.520975 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:20Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:20 crc 
kubenswrapper[4786]: I1002 06:47:20.603106 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.603124 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.603132 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.603143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.603150 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.704493 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.704513 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.704521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.704531 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.704538 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.805855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.805885 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.805895 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.805906 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.805915 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.907943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.907978 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.907988 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.908000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:20 crc kubenswrapper[4786]: I1002 06:47:20.908009 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:20Z","lastTransitionTime":"2025-10-02T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.009835 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.009866 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.009875 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.009884 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.009891 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.111603 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.111647 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.111657 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.111668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.111676 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.212778 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.212807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.212816 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.212826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.212834 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.287033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.287071 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.287083 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.287096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.287105 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: E1002 06:47:21.295553 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:21Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.298088 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.298121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.298133 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.298146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.298154 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: E1002 06:47:21.306230 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:21Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.308206 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.308232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.308241 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.308251 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.308258 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: E1002 06:47:21.315432 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:21Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.317275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.317300 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.317309 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.317319 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.317326 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: E1002 06:47:21.324443 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:21Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.326442 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.326474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.326483 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.326497 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.326505 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: E1002 06:47:21.333975 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:21Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:21 crc kubenswrapper[4786]: E1002 06:47:21.334075 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.334860 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.334886 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.334894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.334905 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.334913 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.436269 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.436322 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.436331 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.436342 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.436350 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.538148 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.538177 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.538186 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.538196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.538203 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.639651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.639684 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.639717 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.639729 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.639736 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.741374 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.741400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.741409 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.741418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.741425 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.842812 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.842841 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.842849 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.842862 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.842869 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.944671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.944721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.944730 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.944739 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:21 crc kubenswrapper[4786]: I1002 06:47:21.944746 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:21Z","lastTransitionTime":"2025-10-02T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.046711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.046734 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.046742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.046751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.046760 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.148654 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.148679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.148701 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.148713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.148722 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.178577 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.178653 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 06:47:22 crc kubenswrapper[4786]: E1002 06:47:22.178675 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.178705 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.178589 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 06:47:22 crc kubenswrapper[4786]: E1002 06:47:22.178817 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 06:47:22 crc kubenswrapper[4786]: E1002 06:47:22.178849 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 06:47:22 crc kubenswrapper[4786]: E1002 06:47:22.178921 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.250737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.250765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.250774 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.250783 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.250789 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.352913 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.352943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.352953 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.352962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.352970 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.454915 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.454984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.455023 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.455046 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.455061 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.557352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.557424 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.557440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.557465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.557479 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.659540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.659578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.659603 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.659618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.659634 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.760874 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.760899 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.760908 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.760918 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.760926 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.863043 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.863100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.863118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.863136 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.863149 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.964544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.964579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.964598 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.964612 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:22 crc kubenswrapper[4786]: I1002 06:47:22.964619 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:22Z","lastTransitionTime":"2025-10-02T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.066396 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.066428 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.066442 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.066461 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.066473 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.167883 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.167916 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.167925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.167936 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.167944 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.268832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.268861 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.268869 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.268880 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.268888 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.371160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.371191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.371200 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.371213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.371223 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.472596 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.472737 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.472751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.472763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.472770 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.574221 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.574252 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.574262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.574275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.574282 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.676282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.676319 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.676334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.676350 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.676361 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.778336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.778440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.778529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.778619 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.778714 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.879976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.880001 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.880010 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.880020 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.880029 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.982339 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.982377 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.982386 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.982400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:23 crc kubenswrapper[4786]: I1002 06:47:23.982410 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:23Z","lastTransitionTime":"2025-10-02T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.084440 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.084513 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.084536 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.084552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.084562 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.179088 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp"
Oct 02 06:47:24 crc kubenswrapper[4786]: E1002 06:47:24.179183 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.179238 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 06:47:24 crc kubenswrapper[4786]: E1002 06:47:24.179277 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.179540 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 06:47:24 crc kubenswrapper[4786]: E1002 06:47:24.179593 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.179735 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 06:47:24 crc kubenswrapper[4786]: E1002 06:47:24.179783 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.186872 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.187126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.187136 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.187149 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.187157 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.289315 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.289350 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.289361 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.289372 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.289382 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.390629 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.390658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.390666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.390676 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.390683 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.492671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.492715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.492724 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.492734 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.492742 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.594888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.594934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.594945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.594958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.594969 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.695957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.695989 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.696000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.696011 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.696019 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.797981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.798019 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.798028 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.798037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.798044 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.899929 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.899962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.899974 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.899985 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 06:47:24 crc kubenswrapper[4786]: I1002 06:47:24.899994 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:24Z","lastTransitionTime":"2025-10-02T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.001507 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.001529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.001537 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.001546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.001562 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.102640 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.102663 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.102671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.102680 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.102710 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.204309 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.204329 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.204336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.204345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.204352 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.306302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.306329 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.306337 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.306346 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.306353 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.408856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.408893 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.408902 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.408919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.408927 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.510488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.510509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.510518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.510526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.510533 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.612119 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.612143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.612151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.612160 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.612168 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.713956 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.713986 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.713994 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.714003 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.714009 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.815723 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.815749 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.815757 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.815767 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.815793 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.917334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.917359 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.917367 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.917379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:25 crc kubenswrapper[4786]: I1002 06:47:25.917386 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:25Z","lastTransitionTime":"2025-10-02T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.019125 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.019154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.019166 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.019177 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.019185 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.121203 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.121237 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.121246 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.121258 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.121267 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.178427 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.178443 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.178483 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:26 crc kubenswrapper[4786]: E1002 06:47:26.178534 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.178592 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:26 crc kubenswrapper[4786]: E1002 06:47:26.178748 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:26 crc kubenswrapper[4786]: E1002 06:47:26.178789 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:26 crc kubenswrapper[4786]: E1002 06:47:26.178838 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.223503 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.223531 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.223549 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.223561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.223569 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.324995 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.325017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.325024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.325032 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.325039 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.426516 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.426590 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.426601 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.426611 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.426618 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.528050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.528076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.528086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.528096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.528103 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.629825 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.629853 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.629862 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.629874 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.629884 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.731512 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.731544 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.731552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.731561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.731569 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.833585 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.833611 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.833620 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.833632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.833641 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.935358 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.935380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.935388 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.935398 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:26 crc kubenswrapper[4786]: I1002 06:47:26.935405 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:26Z","lastTransitionTime":"2025-10-02T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.037614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.037651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.037661 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.037679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.037706 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.139213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.139247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.139257 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.139271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.139279 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.241357 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.241374 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.241383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.241394 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.241401 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.343156 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.343191 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.343200 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.343214 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.343221 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.444809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.444832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.444840 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.444850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.444858 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.545866 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.545891 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.545902 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.545912 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.545919 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.647874 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.647893 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.647900 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.647909 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.647916 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.749761 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.749813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.749822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.749835 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.749844 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.851251 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.851280 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.851288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.851298 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.851306 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.952768 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.952797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.952806 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.952819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:27 crc kubenswrapper[4786]: I1002 06:47:27.952828 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:27Z","lastTransitionTime":"2025-10-02T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.054281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.054310 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.054324 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.054334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.054341 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.156506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.156548 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.156556 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.156568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.156577 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.179049 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.179049 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.179053 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.179083 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:28 crc kubenswrapper[4786]: E1002 06:47:28.179179 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:28 crc kubenswrapper[4786]: E1002 06:47:28.179258 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:28 crc kubenswrapper[4786]: E1002 06:47:28.179333 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:28 crc kubenswrapper[4786]: E1002 06:47:28.179411 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.258719 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.258756 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.258764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.258780 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.258789 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.360254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.360282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.360290 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.360301 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.360309 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.462142 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.462198 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.462208 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.462220 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.462231 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.563901 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.563942 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.563950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.563965 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.563974 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.666136 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.666168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.666178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.666189 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.666196 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.767750 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.767813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.767825 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.767836 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.767844 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.870032 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.870064 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.870073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.870086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.870095 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.972017 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.972048 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.972057 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.972071 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:28 crc kubenswrapper[4786]: I1002 06:47:28.972079 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:28Z","lastTransitionTime":"2025-10-02T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.073655 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.073681 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.073710 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.073724 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.073733 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.175029 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.175073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.175084 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.175094 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.175101 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.276675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.276715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.276724 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.276732 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.276739 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.377957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.377981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.377990 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.378000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.378009 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.479429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.479453 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.479460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.479468 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.479476 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.580785 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.580812 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.580821 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.580832 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.580840 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.682096 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.682123 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.682130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.682139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.682147 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.784233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.784262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.784271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.784283 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.784291 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.885776 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.885813 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.885822 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.885834 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.885843 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.987405 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.987448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.987457 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.987467 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:29 crc kubenswrapper[4786]: I1002 06:47:29.987473 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:29Z","lastTransitionTime":"2025-10-02T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.088771 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.088819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.088828 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.088860 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.088868 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.178820 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.178882 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:30 crc kubenswrapper[4786]: E1002 06:47:30.178911 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.178923 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.178959 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:30 crc kubenswrapper[4786]: E1002 06:47:30.179065 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:30 crc kubenswrapper[4786]: E1002 06:47:30.179186 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:30 crc kubenswrapper[4786]: E1002 06:47:30.179261 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.190730 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.190760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.190768 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.190779 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.190787 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.191246 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.198789 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.205944 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.212475 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.220057 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.231233 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.239370 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.248508 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.256080 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc 
kubenswrapper[4786]: I1002 06:47:30.263593 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.271236 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.279275 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.286321 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.292339 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc 
kubenswrapper[4786]: I1002 06:47:30.292369 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.292378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.292390 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.292398 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.292431 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.301151 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.308861 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.315766 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.323103 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:30Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.393378 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.393401 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.393410 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.393422 4786 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.393430 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.495471 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.495511 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.495519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.495529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.495537 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.597360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.597390 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.597399 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.597412 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.597420 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.699080 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.699131 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.699140 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.699153 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.699162 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.800983 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.801010 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.801018 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.801030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.801038 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.903185 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.903214 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.903225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.903240 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:30 crc kubenswrapper[4786]: I1002 06:47:30.903249 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:30Z","lastTransitionTime":"2025-10-02T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.004652 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.004702 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.004712 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.004724 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.004733 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.106932 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.106989 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.106999 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.107012 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.107021 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.208542 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.208580 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.208589 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.208605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.208614 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.310334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.310387 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.310396 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.310408 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.310416 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.411496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.411524 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.411532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.411543 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.411551 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.513445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.513488 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.513498 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.513508 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.513519 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.615157 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.615190 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.615200 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.615214 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.615222 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.684879 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.684902 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.684911 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.684922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.684929 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: E1002 06:47:31.692964 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:31Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.695130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.695155 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.695164 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.695174 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.695181 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: E1002 06:47:31.702643 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:31Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.704517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.704541 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.704550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.704561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.704571 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: E1002 06:47:31.711599 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:31Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.713318 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.713416 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.713426 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.713435 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.713442 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: E1002 06:47:31.720479 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:31Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.722195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.722224 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.722232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.722290 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.722299 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: E1002 06:47:31.729318 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:31Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:31 crc kubenswrapper[4786]: E1002 06:47:31.729416 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.730253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.730275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.730283 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.730293 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.730300 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.832095 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.832118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.832126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.832146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.832153 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.933451 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.933487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.933495 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.933504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:31 crc kubenswrapper[4786]: I1002 06:47:31.933511 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:31Z","lastTransitionTime":"2025-10-02T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.035520 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.035542 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.035550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.035559 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.035566 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.137272 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.137320 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.137334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.137350 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.137365 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.178396 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.178425 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.178505 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:32 crc kubenswrapper[4786]: E1002 06:47:32.178502 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.178523 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:32 crc kubenswrapper[4786]: E1002 06:47:32.178606 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:32 crc kubenswrapper[4786]: E1002 06:47:32.178621 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:32 crc kubenswrapper[4786]: E1002 06:47:32.178677 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.238945 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.238981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.238989 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.239000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.239008 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.341148 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.341186 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.341195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.341208 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.341218 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.442568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.442814 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.442825 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.442839 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.442847 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.545086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.545122 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.545130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.545143 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.545152 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.646506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.646535 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.646543 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.646554 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.646562 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.748673 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.748719 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.748728 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.748738 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.748746 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.776114 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:32 crc kubenswrapper[4786]: E1002 06:47:32.776248 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:32 crc kubenswrapper[4786]: E1002 06:47:32.776303 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs podName:6e4217c0-9581-4727-b594-adb99293f7db nodeName:}" failed. No retries permitted until 2025-10-02 06:48:04.776289174 +0000 UTC m=+94.897472304 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs") pod "network-metrics-daemon-p8zkp" (UID: "6e4217c0-9581-4727-b594-adb99293f7db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.850563 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.850591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.850598 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.850608 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.850615 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.951985 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.952011 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.952019 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.952030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:32 crc kubenswrapper[4786]: I1002 06:47:32.952037 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:32Z","lastTransitionTime":"2025-10-02T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.053965 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.054001 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.054010 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.054024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.054033 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.155955 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.155983 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.155991 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.156005 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.156014 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.258474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.258508 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.258516 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.258530 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.258558 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.360889 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.360925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.360934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.360946 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.360955 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.463006 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.463039 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.463048 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.463060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.463069 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.564511 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.564548 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.564558 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.564572 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.564581 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.666489 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.666518 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.666526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.666538 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.666545 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.768325 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.768351 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.768360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.768369 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.768376 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.870158 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.870178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.870186 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.870202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.870210 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.971761 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.971810 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.971820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.971834 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:33 crc kubenswrapper[4786]: I1002 06:47:33.971843 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:33Z","lastTransitionTime":"2025-10-02T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.073634 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.073658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.073666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.073704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.073714 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.175245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.175267 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.175275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.175286 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.175294 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.178565 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.178586 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.178584 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.178618 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:34 crc kubenswrapper[4786]: E1002 06:47:34.178686 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:34 crc kubenswrapper[4786]: E1002 06:47:34.178760 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:34 crc kubenswrapper[4786]: E1002 06:47:34.178818 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:34 crc kubenswrapper[4786]: E1002 06:47:34.178871 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.276471 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.276497 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.276506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.276516 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.276523 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.378585 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.378783 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.378873 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.378944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.379008 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.404477 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/0.log" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.404522 4786 generic.go:334] "Generic (PLEG): container finished" podID="de8dcd53-84d9-422e-8f18-63ea8ea75bd2" containerID="5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015" exitCode=1 Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.404544 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7hgkl" event={"ID":"de8dcd53-84d9-422e-8f18-63ea8ea75bd2","Type":"ContainerDied","Data":"5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.404809 4786 scope.go:117] "RemoveContainer" containerID="5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.414493 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"2025-10-02T06:46:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e\\\\n2025-10-02T06:46:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e to /host/opt/cni/bin/\\\\n2025-10-02T06:46:49Z [verbose] multus-daemon started\\\\n2025-10-02T06:46:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T06:47:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.427235 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.434704 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.445097 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.453201 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.461857 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.477885 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.481307 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc 
kubenswrapper[4786]: I1002 06:47:34.481355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.481365 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.481376 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.481385 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.502938 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.510888 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc 
kubenswrapper[4786]: I1002 06:47:34.520299 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5
ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.528519 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.536521 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.544300 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.550660 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.563148 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.571264 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.579416 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.583857 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.583887 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.583898 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.583910 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.583918 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.585907 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:34Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.685990 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.686019 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.686028 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.686039 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.686047 4786 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.787553 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.787590 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.787600 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.787615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.787625 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.889185 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.889218 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.889227 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.889240 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.889251 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.990983 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.991008 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.991016 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.991027 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:34 crc kubenswrapper[4786]: I1002 06:47:34.991035 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:34Z","lastTransitionTime":"2025-10-02T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.092919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.092954 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.092963 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.092976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.092985 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.194487 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.194514 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.194522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.194532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.194539 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.295722 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.295754 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.295762 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.295773 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.295780 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.397030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.397058 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.397066 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.397077 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.397084 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.407920 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/0.log" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.407974 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7hgkl" event={"ID":"de8dcd53-84d9-422e-8f18-63ea8ea75bd2","Type":"ContainerStarted","Data":"8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.418065 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"2025-10-02T06:46:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e\\\\n2025-10-02T06:46:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e to /host/opt/cni/bin/\\\\n2025-10-02T06:46:49Z [verbose] multus-daemon started\\\\n2025-10-02T06:46:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T06:47:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\
",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.430674 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.437880 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.447161 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.453624 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc 
kubenswrapper[4786]: I1002 06:47:35.460528 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.468980 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.476927 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.484297 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.490708 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.498030 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.498052 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.498060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.498071 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.498079 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.499252 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.507294 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.514515 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.522283 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.534772 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.542397 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.549824 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.558328 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:35Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.599850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.599878 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.599888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.599900 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.599909 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.701534 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.701559 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.701568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.701578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.701587 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.803879 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.803925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.803936 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.803958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.803969 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.905618 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.905644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.905652 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.905662 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:35 crc kubenswrapper[4786]: I1002 06:47:35.905670 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:35Z","lastTransitionTime":"2025-10-02T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.007868 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.007904 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.007912 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.007924 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.007933 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.109565 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.109585 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.109593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.109602 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.109609 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.178990 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.179018 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:36 crc kubenswrapper[4786]: E1002 06:47:36.179073 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.179084 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.179105 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:36 crc kubenswrapper[4786]: E1002 06:47:36.179164 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:36 crc kubenswrapper[4786]: E1002 06:47:36.179216 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:36 crc kubenswrapper[4786]: E1002 06:47:36.179546 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.179685 4786 scope.go:117] "RemoveContainer" containerID="f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089" Oct 02 06:47:36 crc kubenswrapper[4786]: E1002 06:47:36.179855 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.211028 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.211059 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.211069 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.211080 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.211089 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.312528 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.312551 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.312560 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.312572 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.312580 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.414048 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.414076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.414084 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.414093 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.414103 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.515809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.515846 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.515857 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.515871 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.515882 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.617739 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.617801 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.617810 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.617819 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.617827 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.719647 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.719671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.719679 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.719709 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.719716 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.821549 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.821656 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.821742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.821812 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.821864 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.923202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.923225 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.923233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.923243 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:36 crc kubenswrapper[4786]: I1002 06:47:36.923251 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:36Z","lastTransitionTime":"2025-10-02T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.024620 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.024645 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.024654 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.024663 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.024680 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.125754 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.125775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.125782 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.125794 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.125802 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.226922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.226960 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.226968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.226982 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.226990 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.328473 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.328504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.328512 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.328522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.328530 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.430088 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.430132 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.430142 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.430154 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.430165 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.531710 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.531739 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.531747 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.531757 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.531764 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.633268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.633490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.633561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.633628 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.633707 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.735081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.735107 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.735116 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.735127 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.735135 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.836899 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.836927 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.836934 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.836944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.836950 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.938521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.938548 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.938556 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.938565 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:37 crc kubenswrapper[4786]: I1002 06:47:37.938573 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:37Z","lastTransitionTime":"2025-10-02T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.040165 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.040234 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.040244 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.040253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.040261 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.142108 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.142434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.142510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.142597 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.142656 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.178595 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:38 crc kubenswrapper[4786]: E1002 06:47:38.179239 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.178623 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:38 crc kubenswrapper[4786]: E1002 06:47:38.179348 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.178646 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:38 crc kubenswrapper[4786]: E1002 06:47:38.179433 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.178601 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:38 crc kubenswrapper[4786]: E1002 06:47:38.179813 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.244031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.244075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.244083 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.244092 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.244100 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.345656 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.345712 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.345721 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.345731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.345737 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.447499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.447527 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.447535 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.447545 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.447552 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.548942 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.548984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.548993 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.549003 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.549011 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.650355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.650395 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.650405 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.650417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.650426 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.751726 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.751751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.751760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.751771 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.751778 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.853395 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.853415 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.853424 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.853434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.853442 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.954786 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.954807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.954814 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.954823 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:38 crc kubenswrapper[4786]: I1002 06:47:38.954830 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:38Z","lastTransitionTime":"2025-10-02T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.056564 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.056634 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.056643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.056651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.056658 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.158441 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.158470 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.158480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.158493 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.158501 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.260227 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.260261 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.260269 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.260282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.260291 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.361873 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.361932 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.361943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.361957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.361965 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.463493 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.463521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.463531 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.463543 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.463551 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.565808 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.565829 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.565836 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.565846 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.565854 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.667119 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.667169 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.667178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.667188 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.667196 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.769099 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.769130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.769139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.769151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.769159 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.870621 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.870651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.870659 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.870668 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.870676 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.971652 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.971671 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.971678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.971705 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:39 crc kubenswrapper[4786]: I1002 06:47:39.971713 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:39Z","lastTransitionTime":"2025-10-02T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.073597 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.073625 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.073633 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.073643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.073651 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.175559 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.175591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.175603 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.175614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.175622 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.178934 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.178946 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:40 crc kubenswrapper[4786]: E1002 06:47:40.179015 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.179046 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:40 crc kubenswrapper[4786]: E1002 06:47:40.179132 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.179142 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:40 crc kubenswrapper[4786]: E1002 06:47:40.179185 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:40 crc kubenswrapper[4786]: E1002 06:47:40.179265 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.188975 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.197476 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.204951 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.212754 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.219231 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.231733 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.240040 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.247526 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.254590 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.264828 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"2025-10-02T06:46:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e\\\\n2025-10-02T06:46:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e to /host/opt/cni/bin/\\\\n2025-10-02T06:46:49Z [verbose] multus-daemon started\\\\n2025-10-02T06:46:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T06:47:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.277100 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.277127 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.277135 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.277148 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.277156 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.280061 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.287206 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.294259 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.301926 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.310093 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.316773 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.325232 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.331550 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:40Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:40 crc 
kubenswrapper[4786]: I1002 06:47:40.378588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.378711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.378788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.378846 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.378902 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.480428 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.480530 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.480593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.480658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.480738 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.581714 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.581738 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.581745 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.581754 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.581761 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.683887 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.683915 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.683925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.683937 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.683945 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.785933 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.785972 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.785980 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.785988 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.785995 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.887921 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.888077 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.888168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.888235 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.888302 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.990289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.990317 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.990326 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.990345 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:40 crc kubenswrapper[4786]: I1002 06:47:40.990353 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:40Z","lastTransitionTime":"2025-10-02T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.092449 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.092553 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.092620 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.092678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.092772 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.193943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.193967 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.193977 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.193988 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.193996 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.295557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.295579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.295586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.295595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.295601 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.397259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.397286 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.397311 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.397336 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.397344 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.499089 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.499113 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.499120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.499130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.499138 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.600480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.600511 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.600519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.600529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.600537 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.702703 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.702739 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.702751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.702763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.702772 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.804379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.804413 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.804422 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.804434 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.804442 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.857878 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.857913 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.857922 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.857937 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.857949 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: E1002 06:47:41.867552 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:41Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.869893 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.869925 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.869936 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.869948 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.869956 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: E1002 06:47:41.878374 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:41Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.880587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.880615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.880641 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.880651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.880659 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: E1002 06:47:41.889233 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:41Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.891371 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.891400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.891409 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.891420 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.891427 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: E1002 06:47:41.898887 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:41Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.900850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.900877 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.900886 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.900895 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.900903 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:41 crc kubenswrapper[4786]: E1002 06:47:41.909037 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:41Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:41 crc kubenswrapper[4786]: E1002 06:47:41.909137 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.910019 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.910044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.910053 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.910062 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:41 crc kubenswrapper[4786]: I1002 06:47:41.910071 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:41Z","lastTransitionTime":"2025-10-02T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.011370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.011392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.011401 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.011410 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.011417 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.112974 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.113003 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.113013 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.113024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.113033 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.178715 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.178732 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:42 crc kubenswrapper[4786]: E1002 06:47:42.178818 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.178842 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.178886 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:42 crc kubenswrapper[4786]: E1002 06:47:42.178993 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:42 crc kubenswrapper[4786]: E1002 06:47:42.179036 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:42 crc kubenswrapper[4786]: E1002 06:47:42.179074 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.214598 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.214628 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.214639 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.214649 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.214656 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.315984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.316028 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.316039 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.316051 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.316061 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.417761 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.417792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.417802 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.417814 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.417822 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.519809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.519836 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.519845 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.519855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.519864 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.621703 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.621742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.621751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.621760 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.621767 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.723568 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.723595 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.723604 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.723615 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.723625 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.825091 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.825121 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.825129 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.825139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.825146 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.926643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.926675 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.926705 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.926718 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:42 crc kubenswrapper[4786]: I1002 06:47:42.926727 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:42Z","lastTransitionTime":"2025-10-02T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.028052 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.028075 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.028084 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.028094 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.028101 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.130048 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.130074 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.130081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.130093 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.130101 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.231502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.231526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.231535 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.231545 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.231552 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.333043 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.333073 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.333081 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.333109 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.333118 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.434882 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.434919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.434927 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.434937 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.434944 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.536555 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.536616 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.536626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.536635 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.536642 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.638897 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.638958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.638967 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.638976 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.638983 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.740628 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.740660 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.740669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.740681 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.740708 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.841943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.841987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.841995 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.842005 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.842012 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.944032 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.944183 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.944250 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.944341 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:43 crc kubenswrapper[4786]: I1002 06:47:43.944403 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:43Z","lastTransitionTime":"2025-10-02T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.046304 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.046334 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.046343 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.046355 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.046363 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.148206 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.148229 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.148238 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.148248 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.148259 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.178856 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.178940 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:44 crc kubenswrapper[4786]: E1002 06:47:44.179032 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.178865 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.178867 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:44 crc kubenswrapper[4786]: E1002 06:47:44.179208 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:44 crc kubenswrapper[4786]: E1002 06:47:44.179277 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:44 crc kubenswrapper[4786]: E1002 06:47:44.179349 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.249888 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.250232 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.250309 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.250365 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.250417 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.351839 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.351867 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.351876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.351887 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.351895 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.453759 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.453783 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.453792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.453803 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.453810 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.555366 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.555383 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.555391 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.555400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.555406 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.657040 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.657067 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.657076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.657085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.657123 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.758728 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.758745 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.758753 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.758765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.758774 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.860540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.860567 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.860577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.860587 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.860595 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.962454 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.962478 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.962486 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.962496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:44 crc kubenswrapper[4786]: I1002 06:47:44.962503 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:44Z","lastTransitionTime":"2025-10-02T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.063715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.063736 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.063744 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.063755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.063762 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.164800 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.164820 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.164828 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.164837 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.164846 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.266157 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.266185 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.266195 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.266208 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.266217 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.368150 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.368179 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.368189 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.368201 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.368211 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.469753 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.469781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.469788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.469797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.469806 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.571398 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.571428 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.571436 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.571448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.571457 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.672480 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.672508 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.672515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.672526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.672533 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.774176 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.774197 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.774204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.774213 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.774220 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.875552 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.875607 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.875617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.875631 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.875640 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.976814 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.976850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.976859 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.976870 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:45 crc kubenswrapper[4786]: I1002 06:47:45.976881 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:45Z","lastTransitionTime":"2025-10-02T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.078036 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.078067 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.078099 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.078112 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.078119 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.178957 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:46 crc kubenswrapper[4786]: E1002 06:47:46.179045 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.179080 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.179119 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.178968 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:46 crc kubenswrapper[4786]: E1002 06:47:46.179159 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:46 crc kubenswrapper[4786]: E1002 06:47:46.179177 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:46 crc kubenswrapper[4786]: E1002 06:47:46.179290 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.179997 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.180024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.180033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.180045 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.180058 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.283629 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.283664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.283674 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.283686 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.283708 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.385438 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.385469 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.385478 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.385489 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.385498 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.487316 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.487344 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.487352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.487362 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.487370 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.589348 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.589380 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.589388 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.589399 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.589409 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.691262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.691288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.691297 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.691306 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.691313 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.793088 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.793118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.793126 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.793138 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.793146 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.894540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.894566 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.894574 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.894585 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.894592 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.996546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.996577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.996588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.996599 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:46 crc kubenswrapper[4786]: I1002 06:47:46.996607 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:46Z","lastTransitionTime":"2025-10-02T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.098266 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.098305 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.098313 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.098326 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.098334 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.185411 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.199471 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.199502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.199511 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.199522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.199531 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.300944 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.300982 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.300994 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.301007 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.301018 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.402835 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.402861 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.402870 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.402882 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.402890 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.504414 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.504451 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.504462 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.504475 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.504485 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.606085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.606116 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.606125 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.606136 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.606144 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.707855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.707885 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.707894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.707903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.707911 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.809629 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.809658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.809667 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.809677 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.809707 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.911034 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.911054 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.911062 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.911071 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:47 crc kubenswrapper[4786]: I1002 06:47:47.911078 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:47Z","lastTransitionTime":"2025-10-02T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.012663 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.012712 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.012722 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.012735 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.012744 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.114211 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.114237 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.114245 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.114254 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.114262 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.178252 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.178316 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:48 crc kubenswrapper[4786]: E1002 06:47:48.178415 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.178443 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.178491 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:48 crc kubenswrapper[4786]: E1002 06:47:48.178545 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:48 crc kubenswrapper[4786]: E1002 06:47:48.178658 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:48 crc kubenswrapper[4786]: E1002 06:47:48.178937 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.179088 4786 scope.go:117] "RemoveContainer" containerID="f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.215793 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.215824 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.215833 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.215844 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.215852 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.317581 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.317613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.317620 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.317632 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.317641 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.419850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.420052 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.420061 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.420074 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.420090 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.433376 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/2.log" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.435525 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.435881 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.452549 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name
\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.464841 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.471635 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93325b6b-2df3-4eb1-80a2-e9bc3f4651b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249e201a7368ce9d1d48889790f03a8613db147b5af37b04d237769fd6c29267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.480579 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"2025-10-02T06:46:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e\\\\n2025-10-02T06:46:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e to /host/opt/cni/bin/\\\\n2025-10-02T06:46:49Z [verbose] multus-daemon started\\\\n2025-10-02T06:46:49Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T06:47:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.488057 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.495610 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.504610 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.511637 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc 
kubenswrapper[4786]: I1002 06:47:48.518723 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.521447 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.521465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.521474 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.521486 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.521494 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.527415 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.534572 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.542261 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.548910 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.557901 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.566356 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.578076 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.589371 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.605881 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06
:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.615398 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:48Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.623209 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.623237 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.623289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.623302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.623310 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.725392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.725423 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.725432 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.725444 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.725453 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.827441 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.827579 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.827636 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.827717 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.827788 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.929271 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.929306 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.929316 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.929329 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:48 crc kubenswrapper[4786]: I1002 06:47:48.929337 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:48Z","lastTransitionTime":"2025-10-02T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.031321 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.031353 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.031384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.031397 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.031407 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.132731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.132757 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.132766 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.132775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.132782 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.234115 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.234148 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.234168 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.234181 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.234189 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.335577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.335605 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.335613 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.335624 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.335632 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.437478 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.437505 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.437512 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.437522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.437529 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.439117 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/3.log" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.439656 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/2.log" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.442642 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" exitCode=1 Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.442670 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.442709 4786 scope.go:117] "RemoveContainer" containerID="f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.443119 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:47:49 crc kubenswrapper[4786]: E1002 06:47:49.443253 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.456130 4786 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:48Z\\\",\\\"message\\\":\\\"1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1002 06:47:48.781515 6863 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on 
switch crc\\\\nI1002 06:47:48.781519 6863 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-g5lv2 in node crc\\\\nI1002 06:47:48.781523 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1002 06:47:48.781531 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1002 06:47:48.781538 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf
94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.464073 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.470537 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93325b6b-2df3-4eb1-80a2-e9bc3f4651b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249e201a7368ce9d1d48889790f03a8613db147b5af37b04d237769fd6c29267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.478367 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"2025-10-02T06:46:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e\\\\n2025-10-02T06:46:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e to /host/opt/cni/bin/\\\\n2025-10-02T06:46:49Z [verbose] multus-daemon started\\\\n2025-10-02T06:46:49Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T06:47:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.486568 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.493855 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.503759 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.510418 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc 
kubenswrapper[4786]: I1002 06:47:49.517781 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.525009 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.531941 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.539523 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.539557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.539565 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.539576 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.539586 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.539553 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.546166 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.554717 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.562418 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.569885 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.576528 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.590793 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06
:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.599330 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:49Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.641319 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.641350 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.641360 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.641372 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.641381 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.742841 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.742871 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.742880 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.742891 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.742899 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.844598 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.844630 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.844638 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.844651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.844660 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.946648 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.946704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.946714 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.946726 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:49 crc kubenswrapper[4786]: I1002 06:47:49.946735 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:49Z","lastTransitionTime":"2025-10-02T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.048463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.048497 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.048506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.048519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.048527 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.150007 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.150036 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.150044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.150055 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.150063 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.178468 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.178496 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.178524 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:50 crc kubenswrapper[4786]: E1002 06:47:50.178567 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:50 crc kubenswrapper[4786]: E1002 06:47:50.178628 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:50 crc kubenswrapper[4786]: E1002 06:47:50.178715 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.178856 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:50 crc kubenswrapper[4786]: E1002 06:47:50.178925 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.187943 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.195870 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac393fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.203821 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.212434 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.220981 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.228757 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.235782 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.247980 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06
:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.251478 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.251501 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.251529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.251541 4786 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.251552 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.256174 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.268299 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6cd2450b07cf99f62dd5804e01789177c670d8afb4638742b885749bf5db089\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:18Z\\\",\\\"message\\\":\\\"terLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-controller-manager-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-controller-manager-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.58\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 06:47:18.774900 6487 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-controller-manager-operator/metrics]} name:Service_openshift-controller-manager-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDN\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:48Z\\\",\\\"message\\\":\\\"1.Pod 
openshift-kube-apiserver/kube-apiserver-crc\\\\nI1002 06:47:48.781515 6863 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1002 06:47:48.781519 6863 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-g5lv2 in node crc\\\\nI1002 06:47:48.781523 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1002 06:47:48.781531 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1002 06:47:48.781538 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf
94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.275270 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.281341 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93325b6b-2df3-4eb1-80a2-e9bc3f4651b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249e201a7368ce9d1d48889790f03a8613db147b5af37b04d237769fd6c29267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.288618 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"2025-10-02T06:46:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e\\\\n2025-10-02T06:46:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e to /host/opt/cni/bin/\\\\n2025-10-02T06:46:49Z [verbose] multus-daemon started\\\\n2025-10-02T06:46:49Z [verbose] 
Readiness Indicator file check\\\\n2025-10-02T06:47:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.296361 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.304234 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.313020 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.320016 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc 
kubenswrapper[4786]: I1002 06:47:50.326929 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.334878 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.353086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.353116 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.353125 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 
06:47:50.353147 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.353157 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.446649 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/3.log" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.449224 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:47:50 crc kubenswrapper[4786]: E1002 06:47:50.449462 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.454275 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.454312 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.454323 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.454337 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.454346 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.457972 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4217c0-9581-4727-b594-adb99293f7db\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ldt5z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-p8zkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc 
kubenswrapper[4786]: I1002 06:47:50.465934 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21cf44da-d50a-47e3-882b-9fb2e9532d4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db07511409f2eb1f1e135e39813df7766557cb3d8c3b62f047fc7bab3ef51f38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7363758d8415da5f7895b0b5366cf40f12e700913f3587f2d094f212f5a0c055\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ffa594e8f05a960db6e2a76483d1a5bba96a51ae8d28b378b044af07cabcdb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://301da38adc54b59320a9b68d33ca0813d0fcffab16c6d1c463e3ff4e00c80738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.474103 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.482925 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ec984c15a10b5a56eac035e9ade03dfbfdec7dd4458252195a1f08a3d0e63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.490674 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"79cb22df-4930-4aed-9108-1056074d1000\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08922337dadd45c64f67d7e7df5c5e6ea6867a4dcd49ec3f040b2bf2344bf58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gfbjt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-p6dmq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.499935 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-r5k86" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b625fb23-ba7e-4931-b753-94dc23e8effa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9570f2b20557ac145c8897fce001edd36898beed300e96762ffa5f4706306777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e1e0a47e5c98af5f2bb0a64d88ba5f20416a253860e0fb8334ed4b66c901f27b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94b2c815b1672961768820da88f91923bb0b0e04186ca0911bb513072fb9367d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://432886276f8c802cd822fc1236d071217bf57d7676ab6b9adbbf00ac10c1ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67e79
93a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e7993a4f3c76c9deaba0bf8127a5173cf95c2b18225aaa8dad35062a09b980\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://164c970358c0f09148f542825f1f9ef8becc386cbf32c80e1ddff9f6bd141a77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eeca970fc84eb770876ed91c515a265d99e34a79725dafaafeb9e31a5ce01514\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdwpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-r5k86\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.508305 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01ec2d9-0876-4b53-92f3-b17748e3691a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d51ad56dc8d1f70be2c6348fc4ada40a445e1cffbc0e0cfe0a3f22b37cc8e81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf882166667a5ad6a4b3e6911f4aa25c60852e2f6ec0c169e11cc4c13e2c400\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82280efa7aa3e92fc4af157b73c2b98f92245c8516aba512d431759b564eb948\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8f5494ff5a6eb032932be95d063ce6e41f3dc673f32076e4958d493f6f9eb9\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8e0f6082a015e040543e0d4de6a30879f65ff1526e727535889b4df3851c71e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T06:46:47Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1002 06:46:42.191115 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 06:46:42.192665 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411567118/tls.crt::/tmp/serving-cert-1411567118/tls.key\\\\\\\"\\\\nI1002 06:46:47.573246 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 06:46:47.575063 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 06:46:47.575080 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 06:46:47.575100 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 06:46:47.575104 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 06:46:47.579106 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 06:46:47.579124 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1002 06:46:47.579126 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1002 06:46:47.579129 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 06:46:47.579142 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 06:46:47.579146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 06:46:47.579149 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 06:46:47.579152 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1002 06:46:47.580058 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecda478a0c8efbd20efa5411af59b3410d06f4d54e768400129c759f1b6bd9dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0455500d2adaa075725823521e6e0a1e71a8953bf2dd54bad6239b909c324d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.516740 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24569544-ccca-4d26-9929-5c7188492bc2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13c8a78c3b5cf58fbfbf061a1958f70fe2f11c6fcc00b668fe58042e56efcec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9f78e92375e773fbcc0227f4a13cff080bd2942b0dbbb8879a5c9daf5191399\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9c894c27c2832a0967da8e6eead5f1d9eb1d6c9a4a344bddef59b7f9d6bb661\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da025762d43cdb825dc2f43bffec148328691b63d6f6a8d3d5580c588dec3ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.523704 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://699bb0ef2fc1cb36d33c0301e326b2631728f7015809ff1a1eb0862a532426d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.531844 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d8f187cbe593bac401c7cd6fd70a5bcbfc9d43bb5fefb60ac1a7e494f7c3ffd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c48f2b060ac39
3fda4498926dd5e7eac877b0a2c7d0f72351fbad4d058ef7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.538559 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g5lv2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f634be-a80e-4770-a408-a258fd303dee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72fddf7b80795f01dd9a861e38c1b41f66901672d94bc78b21a6e85cac8ac3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sx8n4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g5lv2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.550760 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b18d351-b9e5-4763-ad43-f2c4cf938669\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76e48eb326254f0d2181b4b9fea279e61ab9025d56174d1e1996704198e76ea8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://56a7057fb2207f465f8a8728271ab6322341370cea76c4f39cc1d05468d7ed13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99b8f4ad6b0b87219456836e558752ca3af4caaadd5b2741d6dc974fbc94f40b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc386f06a3c11191cca49f26e5f64df29cfb4334df3f5b1fb14352c1d0b48ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://259d96b6f7f25176fcd7a04fdcc61a18e388bdfe4960f9a583ed0f1cbebf2c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ed18cf498b783dcbb25591814c9ad6f3dd5aaf3fe4491d3f1af8436906aa887\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://877b563226f233520350880653fe2da8400054813ac654a6bd8f9fd08e190379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0cc1c6b6976b46034f455d111f93af51abd6a1418dce5055531100c7feca769\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.556519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.556541 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.556549 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.556561 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 
06:47:50.556569 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.560266 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.568202 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.574579 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9h5tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09d5f210-1181-43ab-b65a-12ba1f5a9255\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9611578bcef190486262cb116fbd569bedfb39989f1ff19839e06044c4d5558\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g9mj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:47:00Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9h5tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.581430 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93325b6b-2df3-4eb1-80a2-e9bc3f4651b1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249e201a7368ce9d1d48889790f03a8613db147b5af37b04d237769fd6c29267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ca981336ba9c89b6bad8fc5575ffc2c952224eb8110900cf22038176e2186e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc 
kubenswrapper[4786]: I1002 06:47:50.589772 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7hgkl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de8dcd53-84d9-422e-8f18-63ea8ea75bd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:34Z\\\",\\\"message\\\":\\\"2025-10-02T06:46:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e\\\\n2025-10-02T06:46:49+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_70569a91-f43b-4f6d-80fa-a81393a68a6e to /host/opt/cni/bin/\\\\n2025-10-02T06:46:49Z [verbose] multus-daemon started\\\\n2025-10-02T06:46:49Z [verbose] Readiness Indicator file check\\\\n2025-10-02T06:47:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6gtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7hgkl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.601610 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"894eab78-90cf-4975-aa45-223332e04f5c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T06:47:48Z\\\",\\\"message\\\":\\\"1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1002 06:47:48.781515 6863 
base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI1002 06:47:48.781519 6863 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-g5lv2 in node crc\\\\nI1002 06:47:48.781523 6863 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1002 06:47:48.781531 6863 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nF1002 06:47:48.781538 6863 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?ti\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T06:47:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0348cd3e070b1b001a
4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nsb84\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bgs8z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.608719 4786 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"994073bb-c2bd-4b85-8807-91891f492145\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:46:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d013d22682f592f8b0c19d91936c3ac6d1b2984ad478158e2cbe31c4d76d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3ca6123d7afd007b6fd70b394fd00edd8da0
213ed55f8cb5d75940aaeec4f81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T06:46:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-94spq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T06:46:59Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-2v2ds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:50Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.658614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.658651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.658664 4786 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.658678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.658686 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.760895 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.760923 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.760932 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.760943 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.760952 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.862627 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.862658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.862666 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.862678 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.862701 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.964466 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.964498 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.964506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.964519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:50 crc kubenswrapper[4786]: I1002 06:47:50.964527 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:50Z","lastTransitionTime":"2025-10-02T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.066610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.066643 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.066651 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.066664 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.066673 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.167903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.167936 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.167947 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.167962 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.167972 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.269911 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.269942 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.269952 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.269964 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.269973 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.372202 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.372634 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.372711 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.372789 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.372869 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.474520 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.474556 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.474565 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.474577 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.474589 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.576371 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.576525 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.576614 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.576725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.576921 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.679095 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.679234 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.679319 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.679377 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.679435 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.781437 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.781475 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.781485 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.781499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.781508 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.883448 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.883481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.883491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.883504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.883514 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.955417 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.955463 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.955472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.955484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.955492 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: E1002 06:47:51.965585 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.967919 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.967947 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.967957 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.967968 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.967976 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:51 crc kubenswrapper[4786]: E1002 06:47:51.975998 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:51Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.978276 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.978299 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.978306 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.978316 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:51 crc kubenswrapper[4786]: I1002 06:47:51.978322 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:51Z","lastTransitionTime":"2025-10-02T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.004739 4786 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T06:47:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"32255bd5-d69e-4921-8286-62a7cb990a56\\\",\\\"systemUUID\\\":\\\"21a64352-c064-4ec3-ab53-f6b24546cab3\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T06:47:52Z is after 2025-08-24T17:21:41Z" Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.004837 4786 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.005658 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.005682 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.005704 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.005713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.005720 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.022185 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022262 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:56.022245666 +0000 UTC m=+146.143428807 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.022370 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.022454 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022523 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022530 4786 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022545 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022557 4786 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022584 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:48:56.022569119 +0000 UTC m=+146.143752260 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022603 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 06:48:56.022595498 +0000 UTC m=+146.143778639 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022670 4786 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.022740 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 06:48:56.022722415 +0000 UTC m=+146.143905556 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.022792 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.106751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.106781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.106792 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.106807 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.106817 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.124105 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.124230 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.124256 4786 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.124268 4786 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.124308 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 06:48:56.124297185 +0000 UTC m=+146.245480326 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.178671 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.178725 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.178681 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.178749 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.178801 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.178891 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.178930 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:52 crc kubenswrapper[4786]: E1002 06:47:52.178972 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.208502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.208526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.208534 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.208545 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.208553 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.310578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.310623 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.310631 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.310644 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.310655 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.412669 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.412716 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.412726 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.412742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.412751 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.514244 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.514456 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.514515 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.514570 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.514625 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.616233 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.616281 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.616288 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.616300 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.616310 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.717981 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.718033 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.718043 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.718064 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.718073 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.819979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.820009 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.820018 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.820028 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.820036 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.921825 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.921880 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.921890 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.921903 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:52 crc kubenswrapper[4786]: I1002 06:47:52.921912 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:52Z","lastTransitionTime":"2025-10-02T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.023975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.024010 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.024018 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.024031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.024039 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.125855 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.125890 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.125899 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.125911 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.125920 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.227541 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.227569 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.227578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.227591 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.227599 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.329042 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.329094 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.329104 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.329118 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.329128 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.430879 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.430915 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.430924 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.430936 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.430944 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.533162 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.533196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.533206 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.533220 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.533230 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.634553 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.634586 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.634596 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.634610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.634618 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.736029 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.736074 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.736083 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.736094 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.736103 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.838086 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.838123 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.838131 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.838146 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.838154 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.940235 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.940277 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.940289 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.940306 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:53 crc kubenswrapper[4786]: I1002 06:47:53.940321 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:53Z","lastTransitionTime":"2025-10-02T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.041977 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.042025 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.042037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.042061 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.042072 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.143987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.144022 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.144031 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.144051 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.144060 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.178278 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.178322 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.178349 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.178289 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:54 crc kubenswrapper[4786]: E1002 06:47:54.178383 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:54 crc kubenswrapper[4786]: E1002 06:47:54.178461 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:54 crc kubenswrapper[4786]: E1002 06:47:54.178602 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:54 crc kubenswrapper[4786]: E1002 06:47:54.178630 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.246008 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.246050 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.246059 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.246070 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.246078 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.347942 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.347979 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.347987 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.348002 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.348010 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.449789 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.449814 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.449823 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.449841 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.449848 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.551610 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.551637 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.551646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.551656 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.551664 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.653263 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.653400 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.653464 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.653529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.653590 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.756352 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.756381 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.756390 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.756402 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.756428 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.858466 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.858496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.858506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.858534 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.858545 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.960412 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.960460 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.960491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.960504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:54 crc kubenswrapper[4786]: I1002 06:47:54.960513 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:54Z","lastTransitionTime":"2025-10-02T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.062642 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.062715 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.062726 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.062736 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.062743 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.163958 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.163984 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.163992 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.164019 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.164042 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.265055 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.265088 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.265097 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.265110 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.265119 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.366445 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.366472 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.366481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.366490 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.366497 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.467532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.467578 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.467588 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.467600 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.467609 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.569486 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.569517 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.569526 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.569539 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.569547 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.671745 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.671781 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.671790 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.671824 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.671837 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.773469 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.773496 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.773504 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.773530 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.773541 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.875364 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.875410 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.875420 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.875429 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.875438 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.977059 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.977089 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.977097 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.977107 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:55 crc kubenswrapper[4786]: I1002 06:47:55.977114 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:55Z","lastTransitionTime":"2025-10-02T06:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.079097 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.079129 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.079139 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.079151 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.079160 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.178319 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.178358 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:56 crc kubenswrapper[4786]: E1002 06:47:56.178424 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.178337 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.178450 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:56 crc kubenswrapper[4786]: E1002 06:47:56.178562 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:56 crc kubenswrapper[4786]: E1002 06:47:56.178630 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:56 crc kubenswrapper[4786]: E1002 06:47:56.178763 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.180725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.180750 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.180758 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.180767 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.180815 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.282117 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.282456 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.282475 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.282489 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.282498 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.384194 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.384223 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.384230 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.384240 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.384246 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.486015 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.486057 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.486065 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.486074 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.486081 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.587801 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.587835 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.587843 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.587856 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.587865 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.689435 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.689465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.689473 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.689484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.689492 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.791478 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.791509 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.791521 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.791532 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.791542 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.893111 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.893142 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.893152 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.893164 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.893173 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.994876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.994906 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.994913 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.994926 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:56 crc kubenswrapper[4786]: I1002 06:47:56.994935 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:56Z","lastTransitionTime":"2025-10-02T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.096824 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.096853 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.096860 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.096869 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.096877 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.198484 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.198524 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.198533 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.198547 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.198555 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.300548 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.300583 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.300592 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.300604 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.300612 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.402222 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.402259 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.402268 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.402279 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.402286 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.504447 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.504481 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.504489 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.504500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.504508 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.606209 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.606249 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.606258 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.606269 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.606278 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.708008 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.708037 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.708045 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.708054 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.708061 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.810257 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.810282 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.810291 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.810302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.810313 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.911875 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.911907 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.911916 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.911926 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:57 crc kubenswrapper[4786]: I1002 06:47:57.911935 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:57Z","lastTransitionTime":"2025-10-02T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.013755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.013815 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.013826 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.013842 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.013851 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.115461 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.115491 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.115500 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.115510 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.115518 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.178524 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.178553 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.178596 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.178597 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:47:58 crc kubenswrapper[4786]: E1002 06:47:58.178783 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:47:58 crc kubenswrapper[4786]: E1002 06:47:58.178865 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:47:58 crc kubenswrapper[4786]: E1002 06:47:58.178953 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:47:58 crc kubenswrapper[4786]: E1002 06:47:58.179030 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.217369 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.217395 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.217403 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.217414 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.217421 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.318725 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.318764 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.318775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.318788 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.318797 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.420253 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.420292 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.420300 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.420310 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.420318 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.521563 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.521593 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.521601 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.521612 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.521620 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.623617 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.623646 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.623653 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.623665 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.623673 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.725742 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.725775 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.725784 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.725796 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.725805 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.827333 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.827363 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.827372 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.827384 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.827395 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.929262 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.929290 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.929299 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.929309 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:58 crc kubenswrapper[4786]: I1002 06:47:58.929320 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:58Z","lastTransitionTime":"2025-10-02T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.030746 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.030776 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.030785 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.030795 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.030803 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.132387 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.132418 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.132426 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.132437 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.132446 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.234499 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.234529 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.234536 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.234546 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.234553 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.336264 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.336302 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.336314 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.336329 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.336339 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.437600 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.437626 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.437634 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.437647 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.437656 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.539196 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.539227 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.539236 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.539247 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.539255 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.640502 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.640531 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.640540 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.640550 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.640558 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.742158 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.742184 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.742192 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.742204 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.742212 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.843894 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.843946 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.843956 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.843970 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.843979 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.945392 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.945424 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.945432 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.945443 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:47:59 crc kubenswrapper[4786]: I1002 06:47:59.945451 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:47:59Z","lastTransitionTime":"2025-10-02T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.046731 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.046757 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.046765 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.046777 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.046784 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.148733 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.148768 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.148796 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.148808 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.148816 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.179144 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.179211 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.179222 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:00 crc kubenswrapper[4786]: E1002 06:48:00.179297 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.179328 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:00 crc kubenswrapper[4786]: E1002 06:48:00.179400 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:00 crc kubenswrapper[4786]: E1002 06:48:00.179430 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:00 crc kubenswrapper[4786]: E1002 06:48:00.179462 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.198500 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.198490222 podStartE2EDuration="1m11.198490222s" podCreationTimestamp="2025-10-02 06:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.198457671 +0000 UTC m=+90.319640822" watchObservedRunningTime="2025-10-02 06:48:00.198490222 +0000 UTC m=+90.319673353" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.221002 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9h5tj" podStartSLOduration=72.220988407 podStartE2EDuration="1m12.220988407s" podCreationTimestamp="2025-10-02 06:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.220548247 +0000 UTC m=+90.341731379" watchObservedRunningTime="2025-10-02 06:48:00.220988407 +0000 UTC m=+90.342171538" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.227683 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=13.227674413999999 podStartE2EDuration="13.227674414s" 
podCreationTimestamp="2025-10-02 06:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.227148324 +0000 UTC m=+90.348331455" watchObservedRunningTime="2025-10-02 06:48:00.227674414 +0000 UTC m=+90.348857545" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.250763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.250770 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7hgkl" podStartSLOduration=73.250758301 podStartE2EDuration="1m13.250758301s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.2356377 +0000 UTC m=+90.356820851" watchObservedRunningTime="2025-10-02 06:48:00.250758301 +0000 UTC m=+90.371941431" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.250790 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.250907 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.250937 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.250946 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.269478 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-2v2ds" podStartSLOduration=73.269463579 podStartE2EDuration="1m13.269463579s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.261062748 +0000 UTC m=+90.382245890" watchObservedRunningTime="2025-10-02 06:48:00.269463579 +0000 UTC m=+90.390646710" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.292658 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.292643886 podStartE2EDuration="41.292643886s" podCreationTimestamp="2025-10-02 06:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.28098816 +0000 UTC m=+90.402171311" watchObservedRunningTime="2025-10-02 06:48:00.292643886 +0000 UTC m=+90.413827017" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.316621 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podStartSLOduration=73.316605767 podStartE2EDuration="1m13.316605767s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.316353689 +0000 UTC m=+90.437536830" watchObservedRunningTime="2025-10-02 06:48:00.316605767 +0000 UTC m=+90.437788908" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.332064 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-r5k86" podStartSLOduration=73.332045603 podStartE2EDuration="1m13.332045603s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.331401343 +0000 UTC m=+90.452584484" watchObservedRunningTime="2025-10-02 06:48:00.332045603 +0000 UTC m=+90.453228734" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.349879 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.349867215 podStartE2EDuration="1m12.349867215s" podCreationTimestamp="2025-10-02 06:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.349580342 +0000 UTC m=+90.470763473" watchObservedRunningTime="2025-10-02 06:48:00.349867215 +0000 UTC m=+90.471050347" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.352425 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.352456 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.352465 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.352478 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.352486 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.359385 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.359373727 podStartE2EDuration="1m10.359373727s" podCreationTimestamp="2025-10-02 06:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.359014108 +0000 UTC m=+90.480197259" watchObservedRunningTime="2025-10-02 06:48:00.359373727 +0000 UTC m=+90.480556859" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.381764 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g5lv2" podStartSLOduration=73.381752861 podStartE2EDuration="1m13.381752861s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:00.381303755 +0000 UTC m=+90.502486896" watchObservedRunningTime="2025-10-02 06:48:00.381752861 +0000 UTC m=+90.502935992" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.454027 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.454060 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.454069 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.454082 4786 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.454091 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.555522 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.555553 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.555562 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.555575 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.555584 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.656928 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.656965 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.656975 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.656988 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.656998 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.758753 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.758787 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.758797 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.758809 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.758819 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.860398 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.860428 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.860435 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.860447 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.860457 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.962178 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.962218 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.962227 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.962237 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:00 crc kubenswrapper[4786]: I1002 06:48:00.962244 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:00Z","lastTransitionTime":"2025-10-02T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.064049 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.064078 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.064085 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.064099 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.064107 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.165915 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.165950 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.165959 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.165970 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.165979 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.267838 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.267871 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.267879 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.267905 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.267913 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.369806 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.369850 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.369861 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.369876 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.369900 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.471713 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.471745 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.471755 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.471766 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.471775 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.573708 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.573743 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.573751 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.573763 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.573775 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.674967 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.675015 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.675024 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.675036 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.675044 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.776470 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.776497 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.776506 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.776519 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.776527 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.878525 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.878557 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.878566 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.878576 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.878583 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.980000 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.980027 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.980044 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.980056 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:01 crc kubenswrapper[4786]: I1002 06:48:01.980064 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:01Z","lastTransitionTime":"2025-10-02T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.082338 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.082370 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.082379 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.082390 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.082398 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:02Z","lastTransitionTime":"2025-10-02T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.178449 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.178483 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:02 crc kubenswrapper[4786]: E1002 06:48:02.178545 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.178591 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.178621 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:02 crc kubenswrapper[4786]: E1002 06:48:02.178681 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:02 crc kubenswrapper[4786]: E1002 06:48:02.178757 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:02 crc kubenswrapper[4786]: E1002 06:48:02.178815 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.183992 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.184049 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.184063 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.184076 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.184087 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:02Z","lastTransitionTime":"2025-10-02T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.286089 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.286120 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.286130 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.286141 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.286151 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:02Z","lastTransitionTime":"2025-10-02T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.298218 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.298241 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.298250 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.298263 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.298271 4786 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T06:48:02Z","lastTransitionTime":"2025-10-02T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.322710 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8"] Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.323005 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.324725 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.324800 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.324845 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.324808 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.408116 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.408153 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.408185 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.408213 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.408259 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.509254 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.509397 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.509487 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.509567 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.509611 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.509682 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.509733 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.510114 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.514985 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.522759 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dcaf2345-a42f-4fd3-9aba-e0e6aedecd35-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kvsz8\" (UID: \"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:02 crc kubenswrapper[4786]: I1002 06:48:02.631949 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" Oct 02 06:48:03 crc kubenswrapper[4786]: I1002 06:48:03.178866 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:48:03 crc kubenswrapper[4786]: E1002 06:48:03.179175 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:48:03 crc kubenswrapper[4786]: I1002 06:48:03.475793 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" event={"ID":"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35","Type":"ContainerStarted","Data":"6524a0b92ac4e285fcd0b8c3e238a4d98f679f98e71dc1ce3e9fe64b0dbcb0a0"} Oct 02 06:48:03 crc kubenswrapper[4786]: I1002 06:48:03.476089 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" event={"ID":"dcaf2345-a42f-4fd3-9aba-e0e6aedecd35","Type":"ContainerStarted","Data":"f156317e2f43d16050a82692037f12225898db8b31f4659a46991b1fa016a7b6"} Oct 02 06:48:03 crc kubenswrapper[4786]: I1002 06:48:03.485370 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvsz8" podStartSLOduration=76.485355496 podStartE2EDuration="1m16.485355496s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:03.484881764 +0000 UTC m=+93.606064905" watchObservedRunningTime="2025-10-02 06:48:03.485355496 +0000 UTC 
m=+93.606538627" Oct 02 06:48:04 crc kubenswrapper[4786]: I1002 06:48:04.178377 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:04 crc kubenswrapper[4786]: I1002 06:48:04.178438 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:04 crc kubenswrapper[4786]: I1002 06:48:04.178457 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:04 crc kubenswrapper[4786]: E1002 06:48:04.178495 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:04 crc kubenswrapper[4786]: I1002 06:48:04.178527 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:04 crc kubenswrapper[4786]: E1002 06:48:04.178559 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:04 crc kubenswrapper[4786]: E1002 06:48:04.178656 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:04 crc kubenswrapper[4786]: E1002 06:48:04.178760 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:04 crc kubenswrapper[4786]: I1002 06:48:04.829906 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:04 crc kubenswrapper[4786]: E1002 06:48:04.830017 4786 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:48:04 crc kubenswrapper[4786]: E1002 06:48:04.830069 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs podName:6e4217c0-9581-4727-b594-adb99293f7db nodeName:}" failed. 
No retries permitted until 2025-10-02 06:49:08.830055892 +0000 UTC m=+158.951239023 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs") pod "network-metrics-daemon-p8zkp" (UID: "6e4217c0-9581-4727-b594-adb99293f7db") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 06:48:06 crc kubenswrapper[4786]: I1002 06:48:06.178316 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:06 crc kubenswrapper[4786]: I1002 06:48:06.178351 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:06 crc kubenswrapper[4786]: I1002 06:48:06.178370 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:06 crc kubenswrapper[4786]: E1002 06:48:06.178437 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:06 crc kubenswrapper[4786]: I1002 06:48:06.178476 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:06 crc kubenswrapper[4786]: E1002 06:48:06.178591 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:06 crc kubenswrapper[4786]: E1002 06:48:06.178636 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:06 crc kubenswrapper[4786]: E1002 06:48:06.178776 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:08 crc kubenswrapper[4786]: I1002 06:48:08.178822 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:08 crc kubenswrapper[4786]: I1002 06:48:08.178905 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:08 crc kubenswrapper[4786]: E1002 06:48:08.178989 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:08 crc kubenswrapper[4786]: I1002 06:48:08.179014 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:08 crc kubenswrapper[4786]: E1002 06:48:08.179065 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:08 crc kubenswrapper[4786]: I1002 06:48:08.179063 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:08 crc kubenswrapper[4786]: E1002 06:48:08.179146 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:08 crc kubenswrapper[4786]: E1002 06:48:08.179212 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:10 crc kubenswrapper[4786]: I1002 06:48:10.178223 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:10 crc kubenswrapper[4786]: I1002 06:48:10.178266 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:10 crc kubenswrapper[4786]: E1002 06:48:10.179056 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:10 crc kubenswrapper[4786]: I1002 06:48:10.179078 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:10 crc kubenswrapper[4786]: I1002 06:48:10.179091 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:10 crc kubenswrapper[4786]: E1002 06:48:10.179160 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:10 crc kubenswrapper[4786]: E1002 06:48:10.179213 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:10 crc kubenswrapper[4786]: E1002 06:48:10.179270 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:12 crc kubenswrapper[4786]: I1002 06:48:12.179230 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:12 crc kubenswrapper[4786]: I1002 06:48:12.179261 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:12 crc kubenswrapper[4786]: E1002 06:48:12.179324 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:12 crc kubenswrapper[4786]: I1002 06:48:12.179349 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:12 crc kubenswrapper[4786]: E1002 06:48:12.179484 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:12 crc kubenswrapper[4786]: I1002 06:48:12.179530 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:12 crc kubenswrapper[4786]: E1002 06:48:12.179541 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:12 crc kubenswrapper[4786]: E1002 06:48:12.179630 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:14 crc kubenswrapper[4786]: I1002 06:48:14.178489 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:14 crc kubenswrapper[4786]: I1002 06:48:14.178512 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:14 crc kubenswrapper[4786]: I1002 06:48:14.178489 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:14 crc kubenswrapper[4786]: I1002 06:48:14.178552 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:14 crc kubenswrapper[4786]: E1002 06:48:14.178596 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:14 crc kubenswrapper[4786]: E1002 06:48:14.178653 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:14 crc kubenswrapper[4786]: E1002 06:48:14.178727 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:14 crc kubenswrapper[4786]: E1002 06:48:14.178791 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:16 crc kubenswrapper[4786]: I1002 06:48:16.178673 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:16 crc kubenswrapper[4786]: I1002 06:48:16.178831 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:16 crc kubenswrapper[4786]: I1002 06:48:16.178836 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:16 crc kubenswrapper[4786]: I1002 06:48:16.178928 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:16 crc kubenswrapper[4786]: E1002 06:48:16.178929 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:16 crc kubenswrapper[4786]: E1002 06:48:16.179013 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:16 crc kubenswrapper[4786]: E1002 06:48:16.179073 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:16 crc kubenswrapper[4786]: E1002 06:48:16.179133 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:17 crc kubenswrapper[4786]: I1002 06:48:17.179417 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:48:17 crc kubenswrapper[4786]: E1002 06:48:17.179531 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bgs8z_openshift-ovn-kubernetes(894eab78-90cf-4975-aa45-223332e04f5c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" Oct 02 06:48:18 crc kubenswrapper[4786]: I1002 06:48:18.178342 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:18 crc kubenswrapper[4786]: I1002 06:48:18.178380 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:18 crc kubenswrapper[4786]: I1002 06:48:18.178431 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:18 crc kubenswrapper[4786]: E1002 06:48:18.178431 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:18 crc kubenswrapper[4786]: I1002 06:48:18.178483 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:18 crc kubenswrapper[4786]: E1002 06:48:18.178608 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:18 crc kubenswrapper[4786]: E1002 06:48:18.178659 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:18 crc kubenswrapper[4786]: E1002 06:48:18.178754 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.178788 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.178842 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:20 crc kubenswrapper[4786]: E1002 06:48:20.179536 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.179566 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.179550 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:20 crc kubenswrapper[4786]: E1002 06:48:20.179737 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:20 crc kubenswrapper[4786]: E1002 06:48:20.179793 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:20 crc kubenswrapper[4786]: E1002 06:48:20.179886 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.511820 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/1.log" Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.512172 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/0.log" Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.512210 4786 generic.go:334] "Generic (PLEG): container finished" podID="de8dcd53-84d9-422e-8f18-63ea8ea75bd2" containerID="8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc" exitCode=1 Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.512236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7hgkl" event={"ID":"de8dcd53-84d9-422e-8f18-63ea8ea75bd2","Type":"ContainerDied","Data":"8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc"} Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.512263 4786 scope.go:117] "RemoveContainer" containerID="5977923cfe5ff9940bf321ea70c6a569af0a45b8b5d9eea56ea46219222a6015" Oct 02 06:48:20 crc kubenswrapper[4786]: I1002 06:48:20.512574 4786 scope.go:117] "RemoveContainer" containerID="8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc" Oct 02 06:48:20 crc kubenswrapper[4786]: E1002 06:48:20.512717 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7hgkl_openshift-multus(de8dcd53-84d9-422e-8f18-63ea8ea75bd2)\"" pod="openshift-multus/multus-7hgkl" podUID="de8dcd53-84d9-422e-8f18-63ea8ea75bd2" Oct 02 06:48:21 crc kubenswrapper[4786]: I1002 06:48:21.515467 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/1.log" Oct 02 06:48:22 crc kubenswrapper[4786]: I1002 06:48:22.178602 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:22 crc kubenswrapper[4786]: I1002 06:48:22.178866 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:22 crc kubenswrapper[4786]: E1002 06:48:22.178944 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:22 crc kubenswrapper[4786]: I1002 06:48:22.178974 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:22 crc kubenswrapper[4786]: I1002 06:48:22.178989 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:22 crc kubenswrapper[4786]: E1002 06:48:22.179062 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:22 crc kubenswrapper[4786]: E1002 06:48:22.179146 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:22 crc kubenswrapper[4786]: E1002 06:48:22.179177 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:24 crc kubenswrapper[4786]: I1002 06:48:24.178542 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:24 crc kubenswrapper[4786]: I1002 06:48:24.178603 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:24 crc kubenswrapper[4786]: E1002 06:48:24.178641 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:24 crc kubenswrapper[4786]: E1002 06:48:24.178719 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:24 crc kubenswrapper[4786]: I1002 06:48:24.178781 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:24 crc kubenswrapper[4786]: E1002 06:48:24.178826 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:24 crc kubenswrapper[4786]: I1002 06:48:24.178894 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:24 crc kubenswrapper[4786]: E1002 06:48:24.178980 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:26 crc kubenswrapper[4786]: I1002 06:48:26.178602 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:26 crc kubenswrapper[4786]: E1002 06:48:26.178726 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:26 crc kubenswrapper[4786]: I1002 06:48:26.178770 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:26 crc kubenswrapper[4786]: I1002 06:48:26.178981 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:26 crc kubenswrapper[4786]: E1002 06:48:26.179053 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:26 crc kubenswrapper[4786]: E1002 06:48:26.179096 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:26 crc kubenswrapper[4786]: I1002 06:48:26.179168 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:26 crc kubenswrapper[4786]: E1002 06:48:26.179227 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:28 crc kubenswrapper[4786]: I1002 06:48:28.178325 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:28 crc kubenswrapper[4786]: I1002 06:48:28.178367 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:28 crc kubenswrapper[4786]: I1002 06:48:28.178367 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:28 crc kubenswrapper[4786]: E1002 06:48:28.178420 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:28 crc kubenswrapper[4786]: I1002 06:48:28.178332 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:28 crc kubenswrapper[4786]: E1002 06:48:28.178504 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:28 crc kubenswrapper[4786]: E1002 06:48:28.178614 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:28 crc kubenswrapper[4786]: E1002 06:48:28.178640 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:30 crc kubenswrapper[4786]: E1002 06:48:30.168360 4786 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.178429 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:30 crc kubenswrapper[4786]: E1002 06:48:30.179687 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.179738 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.179755 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.179765 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:30 crc kubenswrapper[4786]: E1002 06:48:30.180055 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:30 crc kubenswrapper[4786]: E1002 06:48:30.180122 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.180172 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:48:30 crc kubenswrapper[4786]: E1002 06:48:30.180188 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:30 crc kubenswrapper[4786]: E1002 06:48:30.257675 4786 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.533410 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/3.log" Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.535176 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerStarted","Data":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.535447 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.554563 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podStartSLOduration=103.554553443 podStartE2EDuration="1m43.554553443s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:30.55377829 +0000 UTC m=+120.674961431" watchObservedRunningTime="2025-10-02 06:48:30.554553443 +0000 UTC m=+120.675736574" Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.762369 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p8zkp"] Oct 02 06:48:30 crc kubenswrapper[4786]: I1002 06:48:30.762635 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:30 crc kubenswrapper[4786]: E1002 06:48:30.762727 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:32 crc kubenswrapper[4786]: I1002 06:48:32.178419 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:32 crc kubenswrapper[4786]: I1002 06:48:32.178534 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:32 crc kubenswrapper[4786]: I1002 06:48:32.178666 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:32 crc kubenswrapper[4786]: E1002 06:48:32.178663 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:32 crc kubenswrapper[4786]: I1002 06:48:32.178723 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:32 crc kubenswrapper[4786]: E1002 06:48:32.178900 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:32 crc kubenswrapper[4786]: I1002 06:48:32.179012 4786 scope.go:117] "RemoveContainer" containerID="8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc" Oct 02 06:48:32 crc kubenswrapper[4786]: E1002 06:48:32.179162 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:32 crc kubenswrapper[4786]: E1002 06:48:32.179232 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:32 crc kubenswrapper[4786]: I1002 06:48:32.541622 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/1.log" Oct 02 06:48:32 crc kubenswrapper[4786]: I1002 06:48:32.541670 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7hgkl" event={"ID":"de8dcd53-84d9-422e-8f18-63ea8ea75bd2","Type":"ContainerStarted","Data":"07b92f0dcfb6f4e4c5ff46665fe3986acd98cd8a36fe13f123938e8010eb7928"} Oct 02 06:48:34 crc kubenswrapper[4786]: I1002 06:48:34.179042 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:34 crc kubenswrapper[4786]: E1002 06:48:34.179428 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 06:48:34 crc kubenswrapper[4786]: I1002 06:48:34.179125 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:34 crc kubenswrapper[4786]: E1002 06:48:34.179493 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 06:48:34 crc kubenswrapper[4786]: I1002 06:48:34.179190 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:34 crc kubenswrapper[4786]: E1002 06:48:34.179539 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 06:48:34 crc kubenswrapper[4786]: I1002 06:48:34.179113 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:34 crc kubenswrapper[4786]: E1002 06:48:34.179607 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p8zkp" podUID="6e4217c0-9581-4727-b594-adb99293f7db" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.179413 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.179413 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.179479 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.179637 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.181793 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.182319 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.182351 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.183411 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.183781 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 02 06:48:36 crc kubenswrapper[4786]: I1002 06:48:36.183949 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.332612 4786 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.360141 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.360450 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.362876 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.362937 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.362993 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.362876 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.363946 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cwznr"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.364224 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.364541 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.364559 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nxvrf"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.364823 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.374433 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.374434 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.374525 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.378216 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.378557 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.378906 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-97p7p"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379236 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379306 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379415 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379546 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379613 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379703 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-txfb6"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379705 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379897 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379947 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.379987 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.380005 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.380392 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5f7p6"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.380916 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.381401 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.381631 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97s7j"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.381876 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.382076 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.382338 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66zdg"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.382607 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.385891 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.386132 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.386237 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.386641 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.386855 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.387125 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.388897 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wpphq"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.389585 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.389908 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.390361 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.392263 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.392588 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c2542"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.392923 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9lpsc"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.393032 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.393263 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.393428 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.393596 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.393915 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.395956 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.396836 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.397196 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.397354 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.397374 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.397762 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.397886 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.398199 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.399622 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ph6hx"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.399673 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.399856 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.399896 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400062 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400094 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400111 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400166 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400184 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400244 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400256 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400331 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400360 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400427 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400586 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400633 4786 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400671 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400722 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.401027 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.401144 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400254 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.401871 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.401935 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.401988 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.402104 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.402192 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.402282 4786 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.401873 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400332 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.401814 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.402941 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.402997 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403049 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403060 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403105 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403127 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403155 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403185 4786 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403194 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403246 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403264 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403272 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403305 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403326 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403345 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.400332 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403378 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.403921 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c"] Oct 02 06:48:43 crc 
kubenswrapper[4786]: I1002 06:48:43.416819 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.417031 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.417099 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.417141 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.417044 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.417300 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.417465 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.417883 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.418141 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.418846 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 
06:48:43.428164 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.428200 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.428252 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.428338 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.428380 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.428381 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-rtc95"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.428490 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.428545 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.429063 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-rtc95" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.429396 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.429585 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.429637 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.430055 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.430257 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.430282 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.430407 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.430447 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.430859 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.430878 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.430990 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.431051 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dhb25"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.431684 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.432093 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.433234 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.433321 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.433396 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.433825 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.434065 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.434159 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.440219 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.440894 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.441547 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.444573 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swzbl"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.445103 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzs4z"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.445384 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5f947"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.445771 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.445821 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.445988 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.446138 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.446582 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-fxqfj"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.446678 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.446916 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgsdc"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.447097 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.447562 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.453703 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.447572 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.454754 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.461883 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.468118 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.468593 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.468835 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.468924 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.469571 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.470450 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.471063 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.471317 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.471805 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.472762 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cwznr"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.473800 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.474675 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.475813 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.476386 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.477706 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97s7j"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.480010 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.481656 4786 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5f7p6"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.481869 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.482913 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.484000 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nxvrf"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.484770 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-97p7p"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.487617 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.488725 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cbv9l"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.489928 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.491166 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2lcqq"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.491850 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.492892 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.493685 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.494591 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66zdg"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.495564 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5f947"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.496720 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dhb25"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.497617 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c2542"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.498807 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.500149 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wpphq"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.500722 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.501586 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49"] Oct 02 06:48:43 
crc kubenswrapper[4786]: I1002 06:48:43.501641 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.503439 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.504485 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swzbl"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.505960 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.506841 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-txfb6"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.507680 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.508422 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgsdc"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.509206 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.509998 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9lpsc"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.510958 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ph6hx"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.511715 4786 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.513213 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.514342 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rtc95"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.515218 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.515996 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.518291 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.524303 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.527635 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2lcqq"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.529274 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzs4z"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.530658 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-22wrg"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.531722 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-22wrg" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.532567 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rfmkw"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.534146 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rfmkw"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.534285 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.534383 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-22wrg"] Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.542216 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.543416 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-service-ca-bundle\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.543590 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-serving-cert\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.543735 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6647561d-fd2f-4b60-b155-ffbc54b0da4f-serving-cert\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.543823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.543903 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1154c0-3781-4db0-bb82-dfad5c61316d-config\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.543980 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-client-ca\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544064 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-policies\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544149 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fcgh\" (UniqueName: \"kubernetes.io/projected/dbb0a742-29b1-49e8-b3cc-730f535ab08a-kube-api-access-9fcgh\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544262 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544343 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44135109-c90d-445e-970f-703ae85719cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544417 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-serving-cert\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544485 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb1154c0-3781-4db0-bb82-dfad5c61316d-trusted-ca\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544553 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544613 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb1154c0-3781-4db0-bb82-dfad5c61316d-serving-cert\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544677 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544789 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: 
\"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544865 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9j8t\" (UniqueName: \"kubernetes.io/projected/6647561d-fd2f-4b60-b155-ffbc54b0da4f-kube-api-access-h9j8t\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.544942 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-audit-dir\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.545030 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.545103 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm85s\" (UniqueName: \"kubernetes.io/projected/5097f23a-2563-4de3-891b-61b5b2bca8a1-kube-api-access-qm85s\") pod \"cluster-samples-operator-665b6dd947-r9ltq\" (UID: \"5097f23a-2563-4de3-891b-61b5b2bca8a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.545625 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-etcd-serving-ca\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.545732 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-etcd-client\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.545815 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.545899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5097f23a-2563-4de3-891b-61b5b2bca8a1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r9ltq\" (UID: \"5097f23a-2563-4de3-891b-61b5b2bca8a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.545995 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: 
\"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbb0a742-29b1-49e8-b3cc-730f535ab08a-serving-cert\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546166 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a03810cb-3b3d-445c-a2d5-5cee24402323-tmpfs\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-dir\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546318 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4291100a-fd7d-40b9-a969-23b459b6079d-proxy-tls\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546392 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-config\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546469 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4mzw\" (UniqueName: \"kubernetes.io/projected/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-kube-api-access-n4mzw\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546588 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546666 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-image-import-ca\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546770 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4291100a-fd7d-40b9-a969-23b459b6079d-images\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: 
\"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546844 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nprzk\" (UniqueName: \"kubernetes.io/projected/4291100a-fd7d-40b9-a969-23b459b6079d-kube-api-access-nprzk\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546905 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-audit-dir\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.546980 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-audit\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547054 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579ch\" (UniqueName: \"kubernetes.io/projected/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-kube-api-access-579ch\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547130 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547202 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547273 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-etcd-client\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547337 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6llpj\" (UniqueName: \"kubernetes.io/projected/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-kube-api-access-6llpj\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547402 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zth59\" (UniqueName: \"kubernetes.io/projected/a03810cb-3b3d-445c-a2d5-5cee24402323-kube-api-access-zth59\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547470 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-config\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a03810cb-3b3d-445c-a2d5-5cee24402323-apiservice-cert\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547605 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jfdj\" (UniqueName: \"kubernetes.io/projected/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-kube-api-access-9jfdj\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547676 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-node-pullsecrets\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547780 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-config\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547850 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547921 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.547995 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e0050175-139a-4210-add6-1b7bbe800f27-images\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548065 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-audit-policies\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" 
Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548144 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n89h\" (UniqueName: \"kubernetes.io/projected/e0050175-139a-4210-add6-1b7bbe800f27-kube-api-access-8n89h\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548207 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brxfj\" (UniqueName: \"kubernetes.io/projected/eb1154c0-3781-4db0-bb82-dfad5c61316d-kube-api-access-brxfj\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548280 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0050175-139a-4210-add6-1b7bbe800f27-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548351 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44135109-c90d-445e-970f-703ae85719cf-config\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548422 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4291100a-fd7d-40b9-a969-23b459b6079d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548504 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-encryption-config\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548572 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548660 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548769 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0050175-139a-4210-add6-1b7bbe800f27-config\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548848 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44135109-c90d-445e-970f-703ae85719cf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.548932 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.549010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-encryption-config\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.549088 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a03810cb-3b3d-445c-a2d5-5cee24402323-webhook-cert\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.549165 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.549232 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.561830 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.582133 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.601984 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.621933 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.641874 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650022 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650078 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650103 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9j8t\" (UniqueName: \"kubernetes.io/projected/6647561d-fd2f-4b60-b155-ffbc54b0da4f-kube-api-access-h9j8t\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-etcd-serving-ca\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650159 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-audit-dir\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 
06:48:43.650181 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650197 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm85s\" (UniqueName: \"kubernetes.io/projected/5097f23a-2563-4de3-891b-61b5b2bca8a1-kube-api-access-qm85s\") pod \"cluster-samples-operator-665b6dd947-r9ltq\" (UID: \"5097f23a-2563-4de3-891b-61b5b2bca8a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650238 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-etcd-client\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650255 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650271 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5097f23a-2563-4de3-891b-61b5b2bca8a1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r9ltq\" (UID: \"5097f23a-2563-4de3-891b-61b5b2bca8a1\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650287 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650307 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbb0a742-29b1-49e8-b3cc-730f535ab08a-serving-cert\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650322 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a03810cb-3b3d-445c-a2d5-5cee24402323-tmpfs\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650337 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-dir\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650353 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4291100a-fd7d-40b9-a969-23b459b6079d-proxy-tls\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650369 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-config\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650388 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4mzw\" (UniqueName: \"kubernetes.io/projected/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-kube-api-access-n4mzw\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650432 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-image-import-ca\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 
06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650448 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4291100a-fd7d-40b9-a969-23b459b6079d-images\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650462 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-audit-dir\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650479 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nprzk\" (UniqueName: \"kubernetes.io/projected/4291100a-fd7d-40b9-a969-23b459b6079d-kube-api-access-nprzk\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650493 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-audit\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650509 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-579ch\" (UniqueName: \"kubernetes.io/projected/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-kube-api-access-579ch\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " 
pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650523 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-etcd-client\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650538 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6llpj\" (UniqueName: \"kubernetes.io/projected/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-kube-api-access-6llpj\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650552 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zth59\" (UniqueName: \"kubernetes.io/projected/a03810cb-3b3d-445c-a2d5-5cee24402323-kube-api-access-zth59\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650568 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650583 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650598 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-config\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650618 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a03810cb-3b3d-445c-a2d5-5cee24402323-apiservice-cert\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650635 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jfdj\" (UniqueName: \"kubernetes.io/projected/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-kube-api-access-9jfdj\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-config\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650667 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650684 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650713 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-node-pullsecrets\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e0050175-139a-4210-add6-1b7bbe800f27-images\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650745 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-audit-policies\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650762 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n89h\" (UniqueName: \"kubernetes.io/projected/e0050175-139a-4210-add6-1b7bbe800f27-kube-api-access-8n89h\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650761 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-etcd-serving-ca\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650774 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650778 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brxfj\" (UniqueName: \"kubernetes.io/projected/eb1154c0-3781-4db0-bb82-dfad5c61316d-kube-api-access-brxfj\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650834 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-audit-dir\") pod 
\"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650916 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0050175-139a-4210-add6-1b7bbe800f27-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.650938 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44135109-c90d-445e-970f-703ae85719cf-config\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.651240 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-node-pullsecrets\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.651333 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-config\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.651452 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-audit\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.651504 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-dir\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.651994 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-audit-dir\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.652524 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.652582 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-image-import-ca\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.652769 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/e0050175-139a-4210-add6-1b7bbe800f27-images\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653195 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4291100a-fd7d-40b9-a969-23b459b6079d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653203 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-config\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653259 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653325 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-encryption-config\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653391 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653507 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a03810cb-3b3d-445c-a2d5-5cee24402323-tmpfs\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653522 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44135109-c90d-445e-970f-703ae85719cf-config\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653588 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-audit-policies\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653649 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653772 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653825 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0050175-139a-4210-add6-1b7bbe800f27-config\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653901 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44135109-c90d-445e-970f-703ae85719cf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653927 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4291100a-fd7d-40b9-a969-23b459b6079d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.653985 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654018 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-encryption-config\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654037 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a03810cb-3b3d-445c-a2d5-5cee24402323-webhook-cert\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654038 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-config\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654144 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654167 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654186 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-service-ca-bundle\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-serving-cert\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654221 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6647561d-fd2f-4b60-b155-ffbc54b0da4f-serving-cert\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654238 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-client-ca\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 
06:48:43.654254 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-policies\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654269 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654283 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1154c0-3781-4db0-bb82-dfad5c61316d-config\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654302 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fcgh\" (UniqueName: \"kubernetes.io/projected/dbb0a742-29b1-49e8-b3cc-730f535ab08a-kube-api-access-9fcgh\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654329 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") 
" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654343 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44135109-c90d-445e-970f-703ae85719cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654358 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-serving-cert\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654373 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb1154c0-3781-4db0-bb82-dfad5c61316d-trusted-ca\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654378 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0050175-139a-4210-add6-1b7bbe800f27-config\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654398 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654415 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb1154c0-3781-4db0-bb82-dfad5c61316d-serving-cert\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.654572 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.655106 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-client-ca\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.655778 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbb0a742-29b1-49e8-b3cc-730f535ab08a-service-ca-bundle\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.655870 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.656017 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb1154c0-3781-4db0-bb82-dfad5c61316d-trusted-ca\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.656096 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbb0a742-29b1-49e8-b3cc-730f535ab08a-serving-cert\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.656273 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.656335 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-etcd-client\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.656386 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.656673 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.656849 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.657108 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5097f23a-2563-4de3-891b-61b5b2bca8a1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r9ltq\" (UID: \"5097f23a-2563-4de3-891b-61b5b2bca8a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.657241 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb1154c0-3781-4db0-bb82-dfad5c61316d-config\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " 
pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.657417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-etcd-client\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.657649 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0050175-139a-4210-add6-1b7bbe800f27-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.657723 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-policies\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.657872 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-encryption-config\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.657947 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.658074 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.658104 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44135109-c90d-445e-970f-703ae85719cf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.658744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb1154c0-3781-4db0-bb82-dfad5c61316d-serving-cert\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.659061 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.659120 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.659398 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.659492 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.659501 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-serving-cert\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.659798 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-encryption-config\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 
06:48:43.660202 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6647561d-fd2f-4b60-b155-ffbc54b0da4f-serving-cert\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.660826 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-serving-cert\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.662382 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.682804 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.687558 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a03810cb-3b3d-445c-a2d5-5cee24402323-webhook-cert\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.694193 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a03810cb-3b3d-445c-a2d5-5cee24402323-apiservice-cert\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:43 
crc kubenswrapper[4786]: I1002 06:48:43.702640 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.722250 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.742278 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.762560 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.783255 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.808403 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.822474 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.842085 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.861979 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.882668 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.922385 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 
06:48:43.942203 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.962479 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 02 06:48:43 crc kubenswrapper[4786]: I1002 06:48:43.981863 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.002609 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.022100 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.031674 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/4291100a-fd7d-40b9-a969-23b459b6079d-images\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.043257 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.061804 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.066616 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4291100a-fd7d-40b9-a969-23b459b6079d-proxy-tls\") pod 
\"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.082832 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.102592 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.122895 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.142864 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.162433 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.185718 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.202380 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.222347 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.241645 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.254209 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.262303 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.281933 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.302245 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.322465 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.342407 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.362348 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.401748 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.422020 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.441766 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.461110 4786 request.go:700] Waited for 1.014796395s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dservice-ca-operator-dockercfg-rg9jl&limit=500&resourceVersion=0
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.462090 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.483343 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.501862 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.522348 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.542286 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.562484 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.587204 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.602379 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.621858 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.642206 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.662487 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.681712 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.701958 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.721757 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.742196 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.762577 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.782445 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.802016 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.822600 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.841859 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.861989 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.882115 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.902506 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.921782 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.942045 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.963261 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 02 06:48:44 crc kubenswrapper[4786]: I1002 06:48:44.981860 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.002383 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.023484 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.042657 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.063201 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.082732 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.102472 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.122367 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.142464 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.161653 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.181785 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.202257 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.222435 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.241656 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.261651 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.282170 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.303097 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.323067 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.342433 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.362012 4786 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.382645 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.402190 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.434965 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brxfj\" (UniqueName: \"kubernetes.io/projected/eb1154c0-3781-4db0-bb82-dfad5c61316d-kube-api-access-brxfj\") pod \"console-operator-58897d9998-97s7j\" (UID: \"eb1154c0-3781-4db0-bb82-dfad5c61316d\") " pod="openshift-console-operator/console-operator-58897d9998-97s7j"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.454098 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nprzk\" (UniqueName: \"kubernetes.io/projected/4291100a-fd7d-40b9-a969-23b459b6079d-kube-api-access-nprzk\") pod \"machine-config-operator-74547568cd-5bfpz\" (UID: \"4291100a-fd7d-40b9-a969-23b459b6079d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.461748 4786 request.go:700] Waited for 1.810737846s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.474305 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm85s\" (UniqueName: \"kubernetes.io/projected/5097f23a-2563-4de3-891b-61b5b2bca8a1-kube-api-access-qm85s\") pod \"cluster-samples-operator-665b6dd947-r9ltq\" (UID: \"5097f23a-2563-4de3-891b-61b5b2bca8a1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.492738 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.493828 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6llpj\" (UniqueName: \"kubernetes.io/projected/4f593c86-36f4-4f67-a149-bc8ad9dacfa9-kube-api-access-6llpj\") pod \"apiserver-7bbb656c7d-gbqpn\" (UID: \"4f593c86-36f4-4f67-a149-bc8ad9dacfa9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.512817 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zth59\" (UniqueName: \"kubernetes.io/projected/a03810cb-3b3d-445c-a2d5-5cee24402323-kube-api-access-zth59\") pod \"packageserver-d55dfcdfc-tmd49\" (UID: \"a03810cb-3b3d-445c-a2d5-5cee24402323\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.534557 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-579ch\" (UniqueName: \"kubernetes.io/projected/1d9c5c0a-e100-4718-8885-ddf84b4f4d03-kube-api-access-579ch\") pod \"apiserver-76f77b778f-5f7p6\" (UID: \"1d9c5c0a-e100-4718-8885-ddf84b4f4d03\") " pod="openshift-apiserver/apiserver-76f77b778f-5f7p6"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.556241 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4mzw\" (UniqueName: \"kubernetes.io/projected/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-kube-api-access-n4mzw\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.576648 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n89h\" (UniqueName: \"kubernetes.io/projected/e0050175-139a-4210-add6-1b7bbe800f27-kube-api-access-8n89h\") pod \"machine-api-operator-5694c8668f-97p7p\" (UID: \"e0050175-139a-4210-add6-1b7bbe800f27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.585178 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.589990 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-97s7j"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.593568 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jfdj\" (UniqueName: \"kubernetes.io/projected/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-kube-api-access-9jfdj\") pod \"oauth-openshift-558db77b4-txfb6\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-txfb6"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.609336 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.619303 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9j8t\" (UniqueName: \"kubernetes.io/projected/6647561d-fd2f-4b60-b155-ffbc54b0da4f-kube-api-access-h9j8t\") pod \"route-controller-manager-6576b87f9c-6z59q\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.628807 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq"]
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.633317 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-82wfh\" (UID: \"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.653398 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.659319 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fcgh\" (UniqueName: \"kubernetes.io/projected/dbb0a742-29b1-49e8-b3cc-730f535ab08a-kube-api-access-9fcgh\") pod \"authentication-operator-69f744f599-cwznr\" (UID: \"dbb0a742-29b1-49e8-b3cc-730f535ab08a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.671465 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.677746 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/44135109-c90d-445e-970f-703ae85719cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jrlfb\" (UID: \"44135109-c90d-445e-970f-703ae85719cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.688959 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.780644 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn"]
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782461 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-config\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782501 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-oauth-serving-cert\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782526 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3901cf8-f320-4105-a138-bebbe3ebaa60-machine-approver-tls\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782542 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nfwp\" (UniqueName: \"kubernetes.io/projected/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-kube-api-access-7nfwp\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782561 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782581 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8939ecee-fc8b-415c-b820-b1dd56984b4f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782608 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-config\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782625 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cv8c\" (UniqueName: \"kubernetes.io/projected/e3901cf8-f320-4105-a138-bebbe3ebaa60-kube-api-access-8cv8c\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782651 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npzvb\" (UniqueName: \"kubernetes.io/projected/38b8798f-7843-48e8-b3ae-d0124362b72c-kube-api-access-npzvb\") pod \"migrator-59844c95c7-fbbcv\" (UID: \"38b8798f-7843-48e8-b3ae-d0124362b72c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782674 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782731 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8939ecee-fc8b-415c-b820-b1dd56984b4f-proxy-tls\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782764 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782792 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2sdk\" (UniqueName: \"kubernetes.io/projected/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-kube-api-access-l2sdk\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782876 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d280e549-70ac-41b1-b9ac-655328a6b6a6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782893 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-serving-cert\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx"
Oct 02 06:48:45 crc kubenswrapper[4786]: E1002 06:48:45.782940 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.282928394 +0000 UTC m=+136.404111526 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782977 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6xg8\" (UniqueName: \"kubernetes.io/projected/9d377e9f-0505-4ad1-8d14-21052dd36255-kube-api-access-q6xg8\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.782996 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3901cf8-f320-4105-a138-bebbe3ebaa60-auth-proxy-config\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783012 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d377e9f-0505-4ad1-8d14-21052dd36255-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783030 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-trusted-ca\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783045 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-oauth-config\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783063 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzvl9\" (UniqueName: \"kubernetes.io/projected/4a4be49b-2638-4ce3-b349-ee4162f13f50-kube-api-access-dzvl9\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783091 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783109 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbgvk\" (UniqueName: \"kubernetes.io/projected/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-kube-api-access-gbgvk\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-tls\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783154 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3901cf8-f320-4105-a138-bebbe3ebaa60-config\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783169 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-serving-cert\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783184 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d280e549-70ac-41b1-b9ac-655328a6b6a6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be77049a-4c1d-4997-9cf5-62578b78fe6a-metrics-tls\") pod \"dns-operator-744455d44c-9lpsc\" (UID: \"be77049a-4c1d-4997-9cf5-62578b78fe6a\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783236 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f33e24f-2161-4101-b5bf-ca094d98505b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783267 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntt7w\" (UniqueName: \"kubernetes.io/projected/27f8eb1a-04be-4c48-b96c-c6c928e1b9d1-kube-api-access-ntt7w\") pod \"downloads-7954f5f757-rtc95\" (UID: \"27f8eb1a-04be-4c48-b96c-c6c928e1b9d1\") " pod="openshift-console/downloads-7954f5f757-rtc95"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783285 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d377e9f-0505-4ad1-8d14-21052dd36255-trusted-ca\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783311 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f33e24f-2161-4101-b5bf-ca094d98505b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783326 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-client-ca\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-serving-cert\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783356 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a4be49b-2638-4ce3-b349-ee4162f13f50-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783374 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d9450d4-2699-4955-84e2-3fb54a67b27e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783396 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-certificates\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783409 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-trusted-ca-bundle\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783423 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-service-ca\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783446 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-bound-sa-token\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783462 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj4p5\" (UniqueName: \"kubernetes.io/projected/63894f70-6430-4f05-b281-723de4b91274-kube-api-access-zj4p5\") pod \"multus-admission-controller-857f4d67dd-dhb25\" (UID: \"63894f70-6430-4f05-b281-723de4b91274\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783477 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9450d4-2699-4955-84e2-3fb54a67b27e-config\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783492 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4be49b-2638-4ce3-b349-ee4162f13f50-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783508 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2mz\" (UniqueName: \"kubernetes.io/projected/be77049a-4c1d-4997-9cf5-62578b78fe6a-kube-api-access-7h2mz\") pod \"dns-operator-744455d44c-9lpsc\" (UID: \"be77049a-4c1d-4997-9cf5-62578b78fe6a\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783531 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4snl9\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-kube-api-access-4snl9\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783544 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f33e24f-2161-4101-b5bf-ca094d98505b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783565 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783583 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783598 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b"
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783612 4786
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63894f70-6430-4f05-b281-723de4b91274-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dhb25\" (UID: \"63894f70-6430-4f05-b281-723de4b91274\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783628 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r85cf\" (UniqueName: \"kubernetes.io/projected/d280e549-70ac-41b1-b9ac-655328a6b6a6-kube-api-access-r85cf\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783715 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65x2\" (UniqueName: \"kubernetes.io/projected/8939ecee-fc8b-415c-b820-b1dd56984b4f-kube-api-access-h65x2\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783781 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmv2s\" (UniqueName: \"kubernetes.io/projected/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-kube-api-access-tmv2s\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783800 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d377e9f-0505-4ad1-8d14-21052dd36255-metrics-tls\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.783822 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d9450d4-2699-4955-84e2-3fb54a67b27e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:45 crc kubenswrapper[4786]: W1002 06:48:45.791352 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f593c86_36f4_4f67_a149_bc8ad9dacfa9.slice/crio-a5a44b9eda982a32f646e3a6618c650448003f32d6c8e25bd4a243a6706a46c5 WatchSource:0}: Error finding container a5a44b9eda982a32f646e3a6618c650448003f32d6c8e25bd4a243a6706a46c5: Status 404 returned error can't find the container with id a5a44b9eda982a32f646e3a6618c650448003f32d6c8e25bd4a243a6706a46c5 Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.803246 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.816198 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49"] Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.835469 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz"] Oct 02 06:48:45 crc kubenswrapper[4786]: W1002 06:48:45.840785 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4291100a_fd7d_40b9_a969_23b459b6079d.slice/crio-aeb7a329b28cfba9fa0896067f7a7a231e90b95ab5f8b7cac36d1c7e103bb8bd WatchSource:0}: Error finding container aeb7a329b28cfba9fa0896067f7a7a231e90b95ab5f8b7cac36d1c7e103bb8bd: Status 404 returned error can't find the container with id aeb7a329b28cfba9fa0896067f7a7a231e90b95ab5f8b7cac36d1c7e103bb8bd Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.854069 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh"] Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.858492 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.873966 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.879600 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884396 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:45 crc kubenswrapper[4786]: E1002 06:48:45.884571 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.384551011 +0000 UTC m=+136.505734142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-config\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884671 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-oauth-serving-cert\") pod 
\"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884806 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3901cf8-f320-4105-a138-bebbe3ebaa60-machine-approver-tls\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884838 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nfwp\" (UniqueName: \"kubernetes.io/projected/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-kube-api-access-7nfwp\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884865 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884880 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8939ecee-fc8b-415c-b820-b1dd56984b4f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884898 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-mountpoint-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884916 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-config\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884931 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cv8c\" (UniqueName: \"kubernetes.io/projected/e3901cf8-f320-4105-a138-bebbe3ebaa60-kube-api-access-8cv8c\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884955 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-srv-cert\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884969 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.884997 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npzvb\" (UniqueName: \"kubernetes.io/projected/38b8798f-7843-48e8-b3ae-d0124362b72c-kube-api-access-npzvb\") pod \"migrator-59844c95c7-fbbcv\" (UID: \"38b8798f-7843-48e8-b3ae-d0124362b72c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885040 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885056 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8939ecee-fc8b-415c-b820-b1dd56984b4f-proxy-tls\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cba0d29-ec3e-4899-a57f-1c398999e668-signing-key\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885086 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e93df4ca-fb71-49dc-8d81-0e3cca035c90-serving-cert\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885111 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885128 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-654fb\" (UniqueName: \"kubernetes.io/projected/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-kube-api-access-654fb\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885176 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-service-ca\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885191 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04d3ac96-87c1-4034-abf4-7d08d218f135-service-ca-bundle\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " 
pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885227 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d280e549-70ac-41b1-b9ac-655328a6b6a6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885244 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-serving-cert\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885260 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2sdk\" (UniqueName: \"kubernetes.io/projected/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-kube-api-access-l2sdk\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-registration-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885291 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6xg8\" (UniqueName: 
\"kubernetes.io/projected/9d377e9f-0505-4ad1-8d14-21052dd36255-kube-api-access-q6xg8\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885306 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3901cf8-f320-4105-a138-bebbe3ebaa60-auth-proxy-config\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885346 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-trusted-ca\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: E1002 06:48:45.885583 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.385575003 +0000 UTC m=+136.506758134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.885857 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-oauth-serving-cert\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.886182 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-config\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.886377 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8939ecee-fc8b-415c-b820-b1dd56984b4f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.886581 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-trusted-ca\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.886595 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3901cf8-f320-4105-a138-bebbe3ebaa60-auth-proxy-config\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.886974 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-oauth-config\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887007 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d377e9f-0505-4ad1-8d14-21052dd36255-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzvl9\" (UniqueName: \"kubernetes.io/projected/4a4be49b-2638-4ce3-b349-ee4162f13f50-kube-api-access-dzvl9\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887090 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-profile-collector-cert\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887115 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887136 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-config-volume\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887155 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887227 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbgvk\" (UniqueName: \"kubernetes.io/projected/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-kube-api-access-gbgvk\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887248 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkbdb\" (UniqueName: \"kubernetes.io/projected/d72b8358-0199-4880-9a73-da40e64092a2-kube-api-access-jkbdb\") pod \"ingress-canary-22wrg\" (UID: \"d72b8358-0199-4880-9a73-da40e64092a2\") " pod="openshift-ingress-canary/ingress-canary-22wrg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887270 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-tls\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887287 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3901cf8-f320-4105-a138-bebbe3ebaa60-config\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d280e549-70ac-41b1-b9ac-655328a6b6a6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887337 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be77049a-4c1d-4997-9cf5-62578b78fe6a-metrics-tls\") pod 
\"dns-operator-744455d44c-9lpsc\" (UID: \"be77049a-4c1d-4997-9cf5-62578b78fe6a\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887355 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f65r\" (UniqueName: \"kubernetes.io/projected/04d3ac96-87c1-4034-abf4-7d08d218f135-kube-api-access-5f65r\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887385 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-serving-cert\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.887403 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f33e24f-2161-4101-b5bf-ca094d98505b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.888146 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.888752 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d92be71-105f-4c6b-90bc-f7e1097db26e-serving-cert\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889016 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntt7w\" (UniqueName: \"kubernetes.io/projected/27f8eb1a-04be-4c48-b96c-c6c928e1b9d1-kube-api-access-ntt7w\") pod \"downloads-7954f5f757-rtc95\" (UID: \"27f8eb1a-04be-4c48-b96c-c6c928e1b9d1\") " pod="openshift-console/downloads-7954f5f757-rtc95" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889060 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-metrics-certs\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889095 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d377e9f-0505-4ad1-8d14-21052dd36255-trusted-ca\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889116 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f33e24f-2161-4101-b5bf-ca094d98505b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889094 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d280e549-70ac-41b1-b9ac-655328a6b6a6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889170 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-client-ca\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-serving-cert\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889224 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a4be49b-2638-4ce3-b349-ee4162f13f50-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889312 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e93df4ca-fb71-49dc-8d81-0e3cca035c90-config\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889340 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnnrt\" (UniqueName: \"kubernetes.io/projected/e93df4ca-fb71-49dc-8d81-0e3cca035c90-kube-api-access-bnnrt\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889362 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889452 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d9450d4-2699-4955-84e2-3fb54a67b27e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889480 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-csi-data-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " 
pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889514 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d72b8358-0199-4880-9a73-da40e64092a2-cert\") pod \"ingress-canary-22wrg\" (UID: \"d72b8358-0199-4880-9a73-da40e64092a2\") " pod="openshift-ingress-canary/ingress-canary-22wrg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889582 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-trusted-ca-bundle\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889618 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm99v\" (UniqueName: \"kubernetes.io/projected/f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9-kube-api-access-bm99v\") pod \"package-server-manager-789f6589d5-dnz65\" (UID: \"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-certificates\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889666 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-service-ca\") pod 
\"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.889835 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a4be49b-2638-4ce3-b349-ee4162f13f50-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.890274 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-client-ca\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.890649 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d377e9f-0505-4ad1-8d14-21052dd36255-trusted-ca\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.890929 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp9jl\" (UniqueName: \"kubernetes.io/projected/6f80061e-e327-432d-a5dd-e0e671298e44-kube-api-access-hp9jl\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891103 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891156 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-socket-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fds6n\" (UniqueName: \"kubernetes.io/projected/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-kube-api-access-fds6n\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891280 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dw6h\" (UniqueName: \"kubernetes.io/projected/6cba0d29-ec3e-4899-a57f-1c398999e668-kube-api-access-8dw6h\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891364 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-bound-sa-token\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" 
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj4p5\" (UniqueName: \"kubernetes.io/projected/63894f70-6430-4f05-b281-723de4b91274-kube-api-access-zj4p5\") pod \"multus-admission-controller-857f4d67dd-dhb25\" (UID: \"63894f70-6430-4f05-b281-723de4b91274\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891448 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-service-ca\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891528 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-client\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891576 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9450d4-2699-4955-84e2-3fb54a67b27e-config\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891592 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-config-volume\") pod \"dns-default-2lcqq\" (UID: 
\"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891610 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwbpm\" (UniqueName: \"kubernetes.io/projected/45186430-df82-4c0f-aaf7-fc032b808f34-kube-api-access-cwbpm\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891628 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82kl7\" (UniqueName: \"kubernetes.io/projected/31505c1f-6fd8-4227-8217-88e4f07c0e74-kube-api-access-82kl7\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891751 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-default-certificate\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891774 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-stats-auth\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891803 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4a4be49b-2638-4ce3-b349-ee4162f13f50-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891820 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9l5z\" (UniqueName: \"kubernetes.io/projected/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-kube-api-access-x9l5z\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.891876 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b4tj\" (UniqueName: \"kubernetes.io/projected/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-kube-api-access-9b4tj\") pod \"dns-default-2lcqq\" (UID: \"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.892009 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-secret-volume\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.892046 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2mz\" (UniqueName: \"kubernetes.io/projected/be77049a-4c1d-4997-9cf5-62578b78fe6a-kube-api-access-7h2mz\") pod \"dns-operator-744455d44c-9lpsc\" (UID: \"be77049a-4c1d-4997-9cf5-62578b78fe6a\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" 
Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.892063 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbmqm\" (UniqueName: \"kubernetes.io/projected/24e3309e-db58-4fa4-a4f6-08fdb1ddb95c-kube-api-access-mbmqm\") pod \"control-plane-machine-set-operator-78cbb6b69f-xw7f8\" (UID: \"24e3309e-db58-4fa4-a4f6-08fdb1ddb95c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.892100 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4snl9\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-kube-api-access-4snl9\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.892117 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f33e24f-2161-4101-b5bf-ca094d98505b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.892172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-srv-cert\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.892191 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/6cba0d29-ec3e-4899-a57f-1c398999e668-signing-cabundle\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.893578 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3901cf8-f320-4105-a138-bebbe3ebaa60-config\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.893671 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-certificates\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.894401 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3901cf8-f320-4105-a138-bebbe3ebaa60-machine-approver-tls\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.894512 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d9450d4-2699-4955-84e2-3fb54a67b27e-config\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.894843 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4ndf\" (UniqueName: \"kubernetes.io/projected/4d92be71-105f-4c6b-90bc-f7e1097db26e-kube-api-access-w4ndf\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.894888 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895210 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895335 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f33e24f-2161-4101-b5bf-ca094d98505b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895361 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895540 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/31505c1f-6fd8-4227-8217-88e4f07c0e74-certs\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895566 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-ca\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895587 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895626 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63894f70-6430-4f05-b281-723de4b91274-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dhb25\" (UID: \"63894f70-6430-4f05-b281-723de4b91274\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895726 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-metrics-tls\") pod \"dns-default-2lcqq\" (UID: \"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895751 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r85cf\" (UniqueName: \"kubernetes.io/projected/d280e549-70ac-41b1-b9ac-655328a6b6a6-kube-api-access-r85cf\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.895771 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/24e3309e-db58-4fa4-a4f6-08fdb1ddb95c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xw7f8\" (UID: \"24e3309e-db58-4fa4-a4f6-08fdb1ddb95c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896037 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-config\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896102 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896261 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-plugins-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896348 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/31505c1f-6fd8-4227-8217-88e4f07c0e74-node-bootstrap-token\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896420 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h65x2\" (UniqueName: \"kubernetes.io/projected/8939ecee-fc8b-415c-b820-b1dd56984b4f-kube-api-access-h65x2\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896491 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-oauth-config\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896599 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmv2s\" (UniqueName: 
\"kubernetes.io/projected/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-kube-api-access-tmv2s\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896641 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d377e9f-0505-4ad1-8d14-21052dd36255-metrics-tls\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896678 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-config\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896735 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be77049a-4c1d-4997-9cf5-62578b78fe6a-metrics-tls\") pod \"dns-operator-744455d44c-9lpsc\" (UID: \"be77049a-4c1d-4997-9cf5-62578b78fe6a\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896756 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dnz65\" (UID: \"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896828 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d9450d4-2699-4955-84e2-3fb54a67b27e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.896846 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-trusted-ca-bundle\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.897064 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.897864 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-serving-cert\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.898027 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.899215 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f33e24f-2161-4101-b5bf-ca094d98505b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.899272 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-tls\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.899894 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8939ecee-fc8b-415c-b820-b1dd56984b4f-proxy-tls\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.900189 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-serving-cert\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.900309 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d377e9f-0505-4ad1-8d14-21052dd36255-metrics-tls\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.902628 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a4be49b-2638-4ce3-b349-ee4162f13f50-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.902815 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/63894f70-6430-4f05-b281-723de4b91274-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-dhb25\" (UID: \"63894f70-6430-4f05-b281-723de4b91274\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.902896 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d9450d4-2699-4955-84e2-3fb54a67b27e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.905062 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-serving-cert\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.905468 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d280e549-70ac-41b1-b9ac-655328a6b6a6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.912999 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.936742 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6xg8\" (UniqueName: \"kubernetes.io/projected/9d377e9f-0505-4ad1-8d14-21052dd36255-kube-api-access-q6xg8\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.941630 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-97s7j"] Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.945641 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5f7p6"] Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.962382 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nfwp\" (UniqueName: \"kubernetes.io/projected/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-kube-api-access-7nfwp\") pod \"console-f9d7485db-ph6hx\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.964896 4786 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cwznr"] Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.978232 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npzvb\" (UniqueName: \"kubernetes.io/projected/38b8798f-7843-48e8-b3ae-d0124362b72c-kube-api-access-npzvb\") pod \"migrator-59844c95c7-fbbcv\" (UID: \"38b8798f-7843-48e8-b3ae-d0124362b72c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv" Oct 02 06:48:45 crc kubenswrapper[4786]: W1002 06:48:45.980977 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d9c5c0a_e100_4718_8885_ddf84b4f4d03.slice/crio-5bf3a955d8ed08a5bdcf6b6e2dbc9c9e94ae6252caa02df6468e9cbafa3dea13 WatchSource:0}: Error finding container 5bf3a955d8ed08a5bdcf6b6e2dbc9c9e94ae6252caa02df6468e9cbafa3dea13: Status 404 returned error can't find the container with id 5bf3a955d8ed08a5bdcf6b6e2dbc9c9e94ae6252caa02df6468e9cbafa3dea13 Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.996931 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2sdk\" (UniqueName: \"kubernetes.io/projected/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-kube-api-access-l2sdk\") pod \"controller-manager-879f6c89f-nxvrf\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.997662 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:45 crc kubenswrapper[4786]: E1002 06:48:45.997770 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.497755091 +0000 UTC m=+136.618938223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.997841 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d92be71-105f-4c6b-90bc-f7e1097db26e-serving-cert\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.997876 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-metrics-certs\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.997913 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93df4ca-fb71-49dc-8d81-0e3cca035c90-config\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.997930 
4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnnrt\" (UniqueName: \"kubernetes.io/projected/e93df4ca-fb71-49dc-8d81-0e3cca035c90-kube-api-access-bnnrt\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.997947 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.997982 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-csi-data-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998002 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d72b8358-0199-4880-9a73-da40e64092a2-cert\") pod \"ingress-canary-22wrg\" (UID: \"d72b8358-0199-4880-9a73-da40e64092a2\") " pod="openshift-ingress-canary/ingress-canary-22wrg" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998030 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm99v\" (UniqueName: \"kubernetes.io/projected/f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9-kube-api-access-bm99v\") pod \"package-server-manager-789f6589d5-dnz65\" (UID: \"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998052 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp9jl\" (UniqueName: \"kubernetes.io/projected/6f80061e-e327-432d-a5dd-e0e671298e44-kube-api-access-hp9jl\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998078 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-socket-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998093 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fds6n\" (UniqueName: \"kubernetes.io/projected/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-kube-api-access-fds6n\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998110 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dw6h\" (UniqueName: \"kubernetes.io/projected/6cba0d29-ec3e-4899-a57f-1c398999e668-kube-api-access-8dw6h\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998137 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-client\") pod 
\"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998155 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-config-volume\") pod \"dns-default-2lcqq\" (UID: \"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998173 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwbpm\" (UniqueName: \"kubernetes.io/projected/45186430-df82-4c0f-aaf7-fc032b808f34-kube-api-access-cwbpm\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998190 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82kl7\" (UniqueName: \"kubernetes.io/projected/31505c1f-6fd8-4227-8217-88e4f07c0e74-kube-api-access-82kl7\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998207 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-default-certificate\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998223 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-stats-auth\") 
pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998242 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9l5z\" (UniqueName: \"kubernetes.io/projected/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-kube-api-access-x9l5z\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998258 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b4tj\" (UniqueName: \"kubernetes.io/projected/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-kube-api-access-9b4tj\") pod \"dns-default-2lcqq\" (UID: \"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998272 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-secret-volume\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998294 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbmqm\" (UniqueName: \"kubernetes.io/projected/24e3309e-db58-4fa4-a4f6-08fdb1ddb95c-kube-api-access-mbmqm\") pod \"control-plane-machine-set-operator-78cbb6b69f-xw7f8\" (UID: \"24e3309e-db58-4fa4-a4f6-08fdb1ddb95c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998325 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-srv-cert\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998339 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6cba0d29-ec3e-4899-a57f-1c398999e668-signing-cabundle\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998355 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4ndf\" (UniqueName: \"kubernetes.io/projected/4d92be71-105f-4c6b-90bc-f7e1097db26e-kube-api-access-w4ndf\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998379 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/31505c1f-6fd8-4227-8217-88e4f07c0e74-certs\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998392 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-ca\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998409 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-metrics-tls\") pod \"dns-default-2lcqq\" (UID: \"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998430 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/24e3309e-db58-4fa4-a4f6-08fdb1ddb95c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xw7f8\" (UID: \"24e3309e-db58-4fa4-a4f6-08fdb1ddb95c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998446 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-plugins-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/31505c1f-6fd8-4227-8217-88e4f07c0e74-node-bootstrap-token\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998484 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-config\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998502 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dnz65\" (UID: \"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998536 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-mountpoint-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998557 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-srv-cert\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:45 crc kubenswrapper[4786]: I1002 06:48:45.998571 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.998600 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.998616 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cba0d29-ec3e-4899-a57f-1c398999e668-signing-key\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.998633 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e93df4ca-fb71-49dc-8d81-0e3cca035c90-serving-cert\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.998650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-654fb\" (UniqueName: \"kubernetes.io/projected/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-kube-api-access-654fb\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999079 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-plugins-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999483 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-service-ca\") pod \"etcd-operator-b45778765-pzs4z\" (UID: 
\"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999505 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04d3ac96-87c1-4034-abf4-7d08d218f135-service-ca-bundle\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999527 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-registration-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999560 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-profile-collector-cert\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999578 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-config-volume\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999593 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999617 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkbdb\" (UniqueName: \"kubernetes.io/projected/d72b8358-0199-4880-9a73-da40e64092a2-kube-api-access-jkbdb\") pod \"ingress-canary-22wrg\" (UID: \"d72b8358-0199-4880-9a73-da40e64092a2\") " pod="openshift-ingress-canary/ingress-canary-22wrg" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:45.999637 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f65r\" (UniqueName: \"kubernetes.io/projected/04d3ac96-87c1-4034-abf4-7d08d218f135-kube-api-access-5f65r\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.001239 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-config\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.001554 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-config-volume\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.001611 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-registration-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.002069 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04d3ac96-87c1-4034-abf4-7d08d218f135-service-ca-bundle\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.004120 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-csi-data-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.005022 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6cba0d29-ec3e-4899-a57f-1c398999e668-signing-cabundle\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.005025 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93df4ca-fb71-49dc-8d81-0e3cca035c90-config\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.005943 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-socket-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.006476 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-service-ca\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.006598 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.006754 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/45186430-df82-4c0f-aaf7-fc032b808f34-mountpoint-dir\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.006776 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-metrics-certs\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.007081 4786 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.507065049 +0000 UTC m=+136.628248181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.007242 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-profile-collector-cert\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.009820 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-srv-cert\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.010156 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-ca\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.010419 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-config-volume\") pod \"dns-default-2lcqq\" (UID: \"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.010515 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/31505c1f-6fd8-4227-8217-88e4f07c0e74-node-bootstrap-token\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.010331 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-metrics-tls\") pod \"dns-default-2lcqq\" (UID: \"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.010704 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.012161 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d72b8358-0199-4880-9a73-da40e64092a2-cert\") pod \"ingress-canary-22wrg\" (UID: \"d72b8358-0199-4880-9a73-da40e64092a2\") " pod="openshift-ingress-canary/ingress-canary-22wrg" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.012372 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/31505c1f-6fd8-4227-8217-88e4f07c0e74-certs\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.012547 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-srv-cert\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.013992 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-stats-auth\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.014080 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6cba0d29-ec3e-4899-a57f-1c398999e668-signing-key\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.015254 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/04d3ac96-87c1-4034-abf4-7d08d218f135-default-certificate\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.015316 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-secret-volume\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.018207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.018703 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/24e3309e-db58-4fa4-a4f6-08fdb1ddb95c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xw7f8\" (UID: \"24e3309e-db58-4fa4-a4f6-08fdb1ddb95c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.019559 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-dnz65\" (UID: \"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.020917 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e93df4ca-fb71-49dc-8d81-0e3cca035c90-serving-cert\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.021114 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cv8c\" (UniqueName: \"kubernetes.io/projected/e3901cf8-f320-4105-a138-bebbe3ebaa60-kube-api-access-8cv8c\") pod \"machine-approver-56656f9798-lptgx\" (UID: \"e3901cf8-f320-4105-a138-bebbe3ebaa60\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.021378 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d92be71-105f-4c6b-90bc-f7e1097db26e-serving-cert\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.021554 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4d92be71-105f-4c6b-90bc-f7e1097db26e-etcd-client\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.035411 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzvl9\" (UniqueName: \"kubernetes.io/projected/4a4be49b-2638-4ce3-b349-ee4162f13f50-kube-api-access-dzvl9\") pod \"openshift-controller-manager-operator-756b6f6bc6-jw92k\" (UID: \"4a4be49b-2638-4ce3-b349-ee4162f13f50\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.058409 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbgvk\" (UniqueName: 
\"kubernetes.io/projected/fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba-kube-api-access-gbgvk\") pod \"openshift-config-operator-7777fb866f-wpphq\" (UID: \"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.077253 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d377e9f-0505-4ad1-8d14-21052dd36255-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7lklr\" (UID: \"9d377e9f-0505-4ad1-8d14-21052dd36255\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.096715 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntt7w\" (UniqueName: \"kubernetes.io/projected/27f8eb1a-04be-4c48-b96c-c6c928e1b9d1-kube-api-access-ntt7w\") pod \"downloads-7954f5f757-rtc95\" (UID: \"27f8eb1a-04be-4c48-b96c-c6c928e1b9d1\") " pod="openshift-console/downloads-7954f5f757-rtc95" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.100289 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.100619 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.600587894 +0000 UTC m=+136.721771024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.117071 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f33e24f-2161-4101-b5bf-ca094d98505b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9zc9c\" (UID: \"3f33e24f-2161-4101-b5bf-ca094d98505b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.133984 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.134793 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9d9450d4-2699-4955-84e2-3fb54a67b27e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-skstx\" (UID: \"9d9450d4-2699-4955-84e2-3fb54a67b27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.144927 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.155314 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj4p5\" (UniqueName: \"kubernetes.io/projected/63894f70-6430-4f05-b281-723de4b91274-kube-api-access-zj4p5\") pod \"multus-admission-controller-857f4d67dd-dhb25\" (UID: \"63894f70-6430-4f05-b281-723de4b91274\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.174910 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-bound-sa-token\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.204367 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.205106 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.705088488 +0000 UTC m=+136.826271618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.208030 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2mz\" (UniqueName: \"kubernetes.io/projected/be77049a-4c1d-4997-9cf5-62578b78fe6a-kube-api-access-7h2mz\") pod \"dns-operator-744455d44c-9lpsc\" (UID: \"be77049a-4c1d-4997-9cf5-62578b78fe6a\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.221596 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.222588 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4snl9\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-kube-api-access-4snl9\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.224737 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.237265 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.238845 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r85cf\" (UniqueName: \"kubernetes.io/projected/d280e549-70ac-41b1-b9ac-655328a6b6a6-kube-api-access-r85cf\") pod \"openshift-apiserver-operator-796bbdcf4f-lbfpr\" (UID: \"d280e549-70ac-41b1-b9ac-655328a6b6a6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.243865 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.247848 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.254736 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65x2\" (UniqueName: \"kubernetes.io/projected/8939ecee-fc8b-415c-b820-b1dd56984b4f-kube-api-access-h65x2\") pod \"machine-config-controller-84d6567774-c2542\" (UID: \"8939ecee-fc8b-415c-b820-b1dd56984b4f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.259491 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.264970 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-rtc95" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.276837 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmv2s\" (UniqueName: \"kubernetes.io/projected/e61739fd-b2a8-4bca-a8a8-caf1260a42bb-kube-api-access-tmv2s\") pod \"kube-storage-version-migrator-operator-b67b599dd-k9j6b\" (UID: \"e61739fd-b2a8-4bca-a8a8-caf1260a42bb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.278332 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.283039 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.298365 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-97p7p"] Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.299822 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.301883 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"] Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.306224 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.306429 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.806401196 +0000 UTC m=+136.927584327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.306721 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.307762 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:46.807747619 +0000 UTC m=+136.928930751 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.310081 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c"] Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.319310 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f65r\" (UniqueName: \"kubernetes.io/projected/04d3ac96-87c1-4034-abf4-7d08d218f135-kube-api-access-5f65r\") pod \"router-default-5444994796-fxqfj\" (UID: \"04d3ac96-87c1-4034-abf4-7d08d218f135\") " pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.339444 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwbpm\" (UniqueName: \"kubernetes.io/projected/45186430-df82-4c0f-aaf7-fc032b808f34-kube-api-access-cwbpm\") pod \"csi-hostpathplugin-rfmkw\" (UID: \"45186430-df82-4c0f-aaf7-fc032b808f34\") " pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.349835 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb"] Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.361836 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b4tj\" (UniqueName: \"kubernetes.io/projected/5c06e2c1-045a-4e69-8b46-da06d4d21ac7-kube-api-access-9b4tj\") pod \"dns-default-2lcqq\" (UID: 
\"5c06e2c1-045a-4e69-8b46-da06d4d21ac7\") " pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.363541 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-txfb6"] Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.385262 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-654fb\" (UniqueName: \"kubernetes.io/projected/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-kube-api-access-654fb\") pod \"collect-profiles-29323125-z9rdd\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.385585 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.386874 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nxvrf"] Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.404469 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.408410 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.408907 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 06:48:46.908888893 +0000 UTC m=+137.030072023 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.426447 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.450740 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbmqm\" (UniqueName: \"kubernetes.io/projected/24e3309e-db58-4fa4-a4f6-08fdb1ddb95c-kube-api-access-mbmqm\") pod \"control-plane-machine-set-operator-78cbb6b69f-xw7f8\" (UID: \"24e3309e-db58-4fa4-a4f6-08fdb1ddb95c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.452516 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9l5z\" (UniqueName: \"kubernetes.io/projected/b276f4e7-8b50-4b3e-8b8b-989ddb05bca8-kube-api-access-x9l5z\") pod \"olm-operator-6b444d44fb-v9ndb\" (UID: \"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.458449 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnnrt\" (UniqueName: \"kubernetes.io/projected/e93df4ca-fb71-49dc-8d81-0e3cca035c90-kube-api-access-bnnrt\") pod \"service-ca-operator-777779d784-5f947\" (UID: \"e93df4ca-fb71-49dc-8d81-0e3cca035c90\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.463875 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm99v\" (UniqueName: \"kubernetes.io/projected/f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9-kube-api-access-bm99v\") pod \"package-server-manager-789f6589d5-dnz65\" (UID: \"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.480512 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4ndf\" (UniqueName: \"kubernetes.io/projected/4d92be71-105f-4c6b-90bc-f7e1097db26e-kube-api-access-w4ndf\") pod \"etcd-operator-b45778765-pzs4z\" (UID: \"4d92be71-105f-4c6b-90bc-f7e1097db26e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.497666 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.510510 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.510789 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.010774588 +0000 UTC m=+137.131957719 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.515824 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp9jl\" (UniqueName: \"kubernetes.io/projected/6f80061e-e327-432d-a5dd-e0e671298e44-kube-api-access-hp9jl\") pod \"marketplace-operator-79b997595-swzbl\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") " pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.524316 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fds6n\" (UniqueName: \"kubernetes.io/projected/95c303f7-ecd5-4fb1-b071-1f1ecdd3d346-kube-api-access-fds6n\") pod \"catalog-operator-68c6474976-jxntq\" (UID: \"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.531498 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.537249 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkbdb\" (UniqueName: \"kubernetes.io/projected/d72b8358-0199-4880-9a73-da40e64092a2-kube-api-access-jkbdb\") pod \"ingress-canary-22wrg\" (UID: \"d72b8358-0199-4880-9a73-da40e64092a2\") " pod="openshift-ingress-canary/ingress-canary-22wrg" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.565436 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dw6h\" (UniqueName: \"kubernetes.io/projected/6cba0d29-ec3e-4899-a57f-1c398999e668-kube-api-access-8dw6h\") pod \"service-ca-9c57cc56f-sgsdc\" (UID: \"6cba0d29-ec3e-4899-a57f-1c398999e668\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.588417 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82kl7\" (UniqueName: \"kubernetes.io/projected/31505c1f-6fd8-4227-8217-88e4f07c0e74-kube-api-access-82kl7\") pod \"machine-config-server-cbv9l\" (UID: \"31505c1f-6fd8-4227-8217-88e4f07c0e74\") " pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.600181 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.608302 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.610972 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.611311 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.111294963 +0000 UTC m=+137.232478094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.611672 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.620124 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.624595 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.627971 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.632298 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" event={"ID":"3f33e24f-2161-4101-b5bf-ca094d98505b","Type":"ContainerStarted","Data":"d0e670915cb1f7d5a73520071dbcc08818ddfc1d1c1d55b90d7a2e82c12e231e"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.634349 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.639075 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.644572 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.646634 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" event={"ID":"5097f23a-2563-4de3-891b-61b5b2bca8a1","Type":"ContainerStarted","Data":"27ca75fb21ee2eed516ad290c88830388b7097497cbf8be396d6716272880080"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.646671 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" event={"ID":"5097f23a-2563-4de3-891b-61b5b2bca8a1","Type":"ContainerStarted","Data":"e2d29f29fe320ce881b6ce1be60975329d59ff448b09aeee74276072db310f1a"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.646702 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" event={"ID":"5097f23a-2563-4de3-891b-61b5b2bca8a1","Type":"ContainerStarted","Data":"002e97f4047c29be6c59c20a6caa1949f381c877660107018d651c94fe5d3b09"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.651214 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.654873 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" event={"ID":"a03810cb-3b3d-445c-a2d5-5cee24402323","Type":"ContainerStarted","Data":"e328e370d8acb13d57464bc7574d3d7925a50a3ca922ed4691e863096382ed40"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.654911 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" event={"ID":"a03810cb-3b3d-445c-a2d5-5cee24402323","Type":"ContainerStarted","Data":"82700d975c3f8f28f3975a0a597deb9077c44f6a30f158ccebc3af896c49a3a5"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.655086 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.666228 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-97s7j" event={"ID":"eb1154c0-3781-4db0-bb82-dfad5c61316d","Type":"ContainerStarted","Data":"ed39f385af6727064419ecb304eaed49b2cff98881bdcd6e0e9264003c338d50"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.666337 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.666350 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-97s7j" event={"ID":"eb1154c0-3781-4db0-bb82-dfad5c61316d","Type":"ContainerStarted","Data":"7b8509979556f0e801aa531993e8f6af4e477803b6e50a64829c29a2e4800402"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.671758 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" event={"ID":"4291100a-fd7d-40b9-a969-23b459b6079d","Type":"ContainerStarted","Data":"f3b0c5e7c8a46cb788559d1329ad7bdd618d6d4e5ffb5b15e83e62733d2f5301"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.671808 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" event={"ID":"4291100a-fd7d-40b9-a969-23b459b6079d","Type":"ContainerStarted","Data":"4af116902c4383da3737a6830344662a63e0fbae2800ba8d87795c59d912ac1b"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.671819 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" event={"ID":"4291100a-fd7d-40b9-a969-23b459b6079d","Type":"ContainerStarted","Data":"aeb7a329b28cfba9fa0896067f7a7a231e90b95ab5f8b7cac36d1c7e103bb8bd"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.677550 4786 generic.go:334] "Generic (PLEG): container finished" podID="4f593c86-36f4-4f67-a149-bc8ad9dacfa9" containerID="c229926a40c0d48eb0fca6bf062b63938f27a71c38a14507856e0897ffea6039" exitCode=0 Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.677768 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" event={"ID":"4f593c86-36f4-4f67-a149-bc8ad9dacfa9","Type":"ContainerDied","Data":"c229926a40c0d48eb0fca6bf062b63938f27a71c38a14507856e0897ffea6039"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.677823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" event={"ID":"4f593c86-36f4-4f67-a149-bc8ad9dacfa9","Type":"ContainerStarted","Data":"a5a44b9eda982a32f646e3a6618c650448003f32d6c8e25bd4a243a6706a46c5"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.684030 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cbv9l" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.684368 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" event={"ID":"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4","Type":"ContainerStarted","Data":"e8947fd2e461ad02a8ae0c3e31d632cc31b5f9056b802635aad132ad348ba239"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.684390 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" event={"ID":"f4e05adf-e51a-464a-aa8d-6bfd25a8dcf4","Type":"ContainerStarted","Data":"a0c09368b5f220933960670b9077c2c5ea90d7814b441cbdc54be62e40738c82"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.685740 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" event={"ID":"44135109-c90d-445e-970f-703ae85719cf","Type":"ContainerStarted","Data":"69f31c48e17eea6fae0b883e6669b7f1d3b4e6d9f5929db4275ae9b5a560affb"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.687919 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" event={"ID":"dbb0a742-29b1-49e8-b3cc-730f535ab08a","Type":"ContainerStarted","Data":"9ccf972bdb2454d0af161a22302498d92acfbb2b1173fc4c4675303def7a467a"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.687995 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" event={"ID":"dbb0a742-29b1-49e8-b3cc-730f535ab08a","Type":"ContainerStarted","Data":"283248af04f3b48e98776fabd9a0f469db5d2f677ac6ba2e83adec0cbc5b52d3"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.690016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" event={"ID":"e0050175-139a-4210-add6-1b7bbe800f27","Type":"ContainerStarted","Data":"29a143dc250d158586ee4f512f43b798f55f9b09cc65a887922efd1c9c93971a"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.694223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" event={"ID":"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9","Type":"ContainerStarted","Data":"a7d64391a90a24288a8ae42cffb251ccb753d39c045c9738e6a66a0ef604305a"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.695933 4786 generic.go:334] "Generic (PLEG): container finished" podID="1d9c5c0a-e100-4718-8885-ddf84b4f4d03" containerID="c9158398ddf5fc5def34615b322844fe6fb07f30db37a39e9438ce18502795d8" exitCode=0 Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.696026 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" event={"ID":"1d9c5c0a-e100-4718-8885-ddf84b4f4d03","Type":"ContainerDied","Data":"c9158398ddf5fc5def34615b322844fe6fb07f30db37a39e9438ce18502795d8"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.696054 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" event={"ID":"1d9c5c0a-e100-4718-8885-ddf84b4f4d03","Type":"ContainerStarted","Data":"5bf3a955d8ed08a5bdcf6b6e2dbc9c9e94ae6252caa02df6468e9cbafa3dea13"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.697266 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" event={"ID":"e3901cf8-f320-4105-a138-bebbe3ebaa60","Type":"ContainerStarted","Data":"79c4b16fdde0767894a2ac1f32ba84bb94496af7741eab1134e78f754ecdc197"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.698668 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" 
event={"ID":"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb","Type":"ContainerStarted","Data":"3a443c42b61aa6f7a0691a4d32cc78eeb3f58f9e45bf08b5dc374e328afcce2b"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.702008 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" event={"ID":"6647561d-fd2f-4b60-b155-ffbc54b0da4f","Type":"ContainerStarted","Data":"bfdf0e7d3c15d0d021fd2be732ce342ccf03a7606372e3f925baac13f1ed2fe9"} Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.702138 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.707678 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-22wrg" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.712485 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.712779 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.212767044 +0000 UTC m=+137.333950175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.746272 4786 patch_prober.go:28] interesting pod/console-operator-58897d9998-97s7j container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.746458 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-97s7j" podUID="eb1154c0-3781-4db0-bb82-dfad5c61316d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.748053 4786 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6z59q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.748131 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" podUID="6647561d-fd2f-4b60-b155-ffbc54b0da4f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 
02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.809191 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.816839 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.816994 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.316955006 +0000 UTC m=+137.438138137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.817396 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.823077 4786 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.323057079 +0000 UTC m=+137.444240209 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.883700 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9lpsc"] Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.898084 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wpphq"] Oct 02 06:48:46 crc kubenswrapper[4786]: I1002 06:48:46.925760 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:46 crc kubenswrapper[4786]: E1002 06:48:46.926787 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.426765961 +0000 UTC m=+137.547949092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.027814 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.028325 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.528309738 +0000 UTC m=+137.649492869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.042758 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv"] Oct 02 06:48:47 crc kubenswrapper[4786]: W1002 06:48:47.123021 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b8798f_7843_48e8_b3ae_d0124362b72c.slice/crio-87122e8c526f1ff9e879de432064c38c692a8bbb7c176d8ed36bc65477c0ddbb WatchSource:0}: Error finding container 87122e8c526f1ff9e879de432064c38c692a8bbb7c176d8ed36bc65477c0ddbb: Status 404 returned error can't find the container with id 87122e8c526f1ff9e879de432064c38c692a8bbb7c176d8ed36bc65477c0ddbb Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.129012 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.129494 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.629479504 +0000 UTC m=+137.750662635 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.205510 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rtc95"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.211601 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ph6hx"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.214252 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.215026 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.232648 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.233202 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 06:48:47.733172387 +0000 UTC m=+137.854355518 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.334107 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.334210 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.834191188 +0000 UTC m=+137.955374320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.334403 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.334814 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.834794613 +0000 UTC m=+137.955977744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: W1002 06:48:47.405022 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca2df496_2124_4881_ae15_4fa5a1a4f0ea.slice/crio-a46a57ecbcd87e3a54ddd2b1c4300ed448c1c11596fff619ed978cae544b5fb8 WatchSource:0}: Error finding container a46a57ecbcd87e3a54ddd2b1c4300ed448c1c11596fff619ed978cae544b5fb8: Status 404 returned error can't find the container with id a46a57ecbcd87e3a54ddd2b1c4300ed448c1c11596fff619ed978cae544b5fb8 Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.439284 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.439643 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:47.939628809 +0000 UTC m=+138.060811940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.443168 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r9ltq" podStartSLOduration=120.443146591 podStartE2EDuration="2m0.443146591s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:47.43762552 +0000 UTC m=+137.558808691" watchObservedRunningTime="2025-10-02 06:48:47.443146591 +0000 UTC m=+137.564329723" Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.463599 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rfmkw"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.482411 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-97s7j" podStartSLOduration=120.482397625 podStartE2EDuration="2m0.482397625s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:47.481870565 +0000 UTC m=+137.603053696" watchObservedRunningTime="2025-10-02 06:48:47.482397625 +0000 UTC m=+137.603580756" Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.541729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.542080 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.042067413 +0000 UTC m=+138.163250544 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.554424 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-cwznr" podStartSLOduration=120.554406636 podStartE2EDuration="2m0.554406636s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:47.552608776 +0000 UTC m=+137.673791907" watchObservedRunningTime="2025-10-02 06:48:47.554406636 +0000 UTC m=+137.675589757" Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.643292 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.643894 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.143877906 +0000 UTC m=+138.265061037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.657551 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-dhb25"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.657609 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.663022 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c2542"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.688017 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.703983 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.735497 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2lcqq"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.735627 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cbv9l" event={"ID":"31505c1f-6fd8-4227-8217-88e4f07c0e74","Type":"ContainerStarted","Data":"65770476eee978ac5d75187da9d7eebf78c2edcccb2a16d513c23e1b85d06012"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.735648 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cbv9l" event={"ID":"31505c1f-6fd8-4227-8217-88e4f07c0e74","Type":"ContainerStarted","Data":"dc183feac790a39e5e8f7664af14defe4145d28cc00cc2a48552710130546e57"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.747077 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.748706 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.24867868 +0000 UTC m=+138.369861811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.752841 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" event={"ID":"1d9c5c0a-e100-4718-8885-ddf84b4f4d03","Type":"ContainerStarted","Data":"55a7108d82d29345a4dc2e1a770de56033e657ab95c157f7098c7d66454d1fd6"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.767194 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rtc95" event={"ID":"27f8eb1a-04be-4c48-b96c-c6c928e1b9d1","Type":"ContainerStarted","Data":"a29f2d9698a6e1c13beac03b4a5a950260296a9e18a8c2f57718b4ef31874779"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.783949 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" event={"ID":"e0050175-139a-4210-add6-1b7bbe800f27","Type":"ContainerStarted","Data":"53b2a404617b3eb4163d35da4c69a4f0f6bbde6c551b516b60256f89966ae571"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.783999 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" event={"ID":"e0050175-139a-4210-add6-1b7bbe800f27","Type":"ContainerStarted","Data":"09ccae439e2d1c43578aeab6764a99098ff936d793f9a2eb5edf64464595fd94"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.789221 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pzs4z"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 
06:48:47.793406 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" event={"ID":"44135109-c90d-445e-970f-703ae85719cf","Type":"ContainerStarted","Data":"a06795477a6f154dc82f01d1255c4bf19f0ea8c33ca8d4cfe638e2f2e077f308"} Oct 02 06:48:47 crc kubenswrapper[4786]: W1002 06:48:47.807907 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c06e2c1_045a_4e69_8b46_da06d4d21ac7.slice/crio-6f02ecd5a28b4cf4cb0004a06ad2998b4b7baf96bd877fa3c93ee7be788023cc WatchSource:0}: Error finding container 6f02ecd5a28b4cf4cb0004a06ad2998b4b7baf96bd877fa3c93ee7be788023cc: Status 404 returned error can't find the container with id 6f02ecd5a28b4cf4cb0004a06ad2998b4b7baf96bd877fa3c93ee7be788023cc Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.810720 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" event={"ID":"3f33e24f-2161-4101-b5bf-ca094d98505b","Type":"ContainerStarted","Data":"9d64aee77a295b081fe0654b796ee49f9760b8868a50eb929572fc559032b058"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.818703 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.823883 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgsdc"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.831257 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" event={"ID":"45186430-df82-4c0f-aaf7-fc032b808f34","Type":"ContainerStarted","Data":"a7d82fa53fd7c652d0ce006ea05b06f779856062bc33e76891e8965caf90999b"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.847294 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-f9d7485db-ph6hx" event={"ID":"ca2df496-2124-4881-ae15-4fa5a1a4f0ea","Type":"ContainerStarted","Data":"a46a57ecbcd87e3a54ddd2b1c4300ed448c1c11596fff619ed978cae544b5fb8"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.847587 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.848594 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.34856867 +0000 UTC m=+138.469751801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.854121 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" event={"ID":"6647561d-fd2f-4b60-b155-ffbc54b0da4f","Type":"ContainerStarted","Data":"9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.857529 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.860310 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.863485 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.864613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" event={"ID":"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9","Type":"ContainerStarted","Data":"4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.865056 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:47 crc kubenswrapper[4786]: 
I1002 06:48:47.878939 4786 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nxvrf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.878998 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" podUID="4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.900339 4786 generic.go:334] "Generic (PLEG): container finished" podID="fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba" containerID="d88e7d3ea70d5049e885cfc1098c72cc2eb5872f003485ff740fb48dfffc93aa" exitCode=0 Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.902109 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" event={"ID":"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba","Type":"ContainerDied","Data":"d88e7d3ea70d5049e885cfc1098c72cc2eb5872f003485ff740fb48dfffc93aa"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.902146 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" event={"ID":"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba","Type":"ContainerStarted","Data":"17bf9c788a3c485379b8baefdac0669c81b31be8fadb26caad9e21bd77e8339e"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.919013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" event={"ID":"be77049a-4c1d-4997-9cf5-62578b78fe6a","Type":"ContainerStarted","Data":"db54cc403e4c86834d46c78a1b85fa73fde1098bc70c75acfecc391ca5f4bf73"} Oct 02 
06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.919335 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" event={"ID":"be77049a-4c1d-4997-9cf5-62578b78fe6a","Type":"ContainerStarted","Data":"978b883599064614bbb41ea5b7061f09bc6faa09e95382c0ef9555757dd7f542"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.922198 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" podStartSLOduration=119.922180009 podStartE2EDuration="1m59.922180009s" podCreationTimestamp="2025-10-02 06:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:47.878083196 +0000 UTC m=+137.999266328" watchObservedRunningTime="2025-10-02 06:48:47.922180009 +0000 UTC m=+138.043363141" Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.923129 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" event={"ID":"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb","Type":"ContainerStarted","Data":"6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.923479 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5bfpz" podStartSLOduration=120.923470558 podStartE2EDuration="2m0.923470558s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:47.911612438 +0000 UTC m=+138.032795579" watchObservedRunningTime="2025-10-02 06:48:47.923470558 +0000 UTC m=+138.044653689" Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.923547 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.935430 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.935982 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv" event={"ID":"38b8798f-7843-48e8-b3ae-d0124362b72c","Type":"ContainerStarted","Data":"ea339c02781ab52217764385a276a68b2d2d2c91471a517265f05272ed2eaf7a"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.936018 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv" event={"ID":"38b8798f-7843-48e8-b3ae-d0124362b72c","Type":"ContainerStarted","Data":"87122e8c526f1ff9e879de432064c38c692a8bbb7c176d8ed36bc65477c0ddbb"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.949453 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:47 crc kubenswrapper[4786]: E1002 06:48:47.949871 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.449859557 +0000 UTC m=+138.571042677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.952266 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" event={"ID":"4f593c86-36f4-4f67-a149-bc8ad9dacfa9","Type":"ContainerStarted","Data":"6630452a941964a8c52305f2091e55ef05f982608faa8084793415466e410377"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.959682 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.960234 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" event={"ID":"9d377e9f-0505-4ad1-8d14-21052dd36255","Type":"ContainerStarted","Data":"4a54991b6d382de3ddd1e018f2ebcaf05e61b6eaa4e8b916cd1d741046f5107d"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.960270 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" event={"ID":"9d377e9f-0505-4ad1-8d14-21052dd36255","Type":"ContainerStarted","Data":"2b634f7b5d42acd915d72c4ea8cd2f4204dc5f0f4c03d796b63a5904f6ace0f3"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.970120 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-22wrg"] Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.986259 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" event={"ID":"e3901cf8-f320-4105-a138-bebbe3ebaa60","Type":"ContainerStarted","Data":"7767e1ebe27e4684f0111d9f15ee4a0b4652db4ef2449621231ec6ed3d25c1bd"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.986327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" event={"ID":"e3901cf8-f320-4105-a138-bebbe3ebaa60","Type":"ContainerStarted","Data":"e70b49638b665eca3ed83c0aa107bdd77bfcff57d7380a5024fa71a4c99bb6ee"} Oct 02 06:48:47 crc kubenswrapper[4786]: I1002 06:48:47.996955 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5f947"] Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.002235 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swzbl"] Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.013677 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fxqfj" event={"ID":"04d3ac96-87c1-4034-abf4-7d08d218f135","Type":"ContainerStarted","Data":"b438f0b58a72c3b5bc7787ab6ce6063caa010571ec83879d42de6c635a556548"} Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.013737 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fxqfj" event={"ID":"04d3ac96-87c1-4034-abf4-7d08d218f135","Type":"ContainerStarted","Data":"d862726978cab9ef233a3956b38b4a9c68364f7281cbaeb6354ea2afc67750c2"} Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.018575 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.040089 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" event={"ID":"9d9450d4-2699-4955-84e2-3fb54a67b27e","Type":"ContainerStarted","Data":"52650186f3c0b60e8f1e307dc4452da93cbbbf87e65e1664c9f8de8b065fd9de"} Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.052066 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-97s7j" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.053076 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tmd49" podStartSLOduration=121.053063211 podStartE2EDuration="2m1.053063211s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.052376419 +0000 UTC m=+138.173559570" watchObservedRunningTime="2025-10-02 06:48:48.053063211 +0000 UTC m=+138.174246341" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.064237 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.069934 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.569908459 +0000 UTC m=+138.691091581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.074623 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.081222 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.58119212 +0000 UTC m=+138.702375251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.182549 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.183429 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.683410656 +0000 UTC m=+138.804593787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.283573 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.283875 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.783862071 +0000 UTC m=+138.905045202 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.283604 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-82wfh" podStartSLOduration=121.283588041 podStartE2EDuration="2m1.283588041s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.246113938 +0000 UTC m=+138.367297079" watchObservedRunningTime="2025-10-02 06:48:48.283588041 +0000 UTC m=+138.404771172" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.336957 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9zc9c" podStartSLOduration=121.336936942 podStartE2EDuration="2m1.336936942s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.290089261 +0000 UTC m=+138.411272412" watchObservedRunningTime="2025-10-02 06:48:48.336936942 +0000 UTC m=+138.458120073" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.364974 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" podStartSLOduration=121.364960512 podStartE2EDuration="2m1.364960512s" podCreationTimestamp="2025-10-02 06:46:47 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.338955059 +0000 UTC m=+138.460138200" watchObservedRunningTime="2025-10-02 06:48:48.364960512 +0000 UTC m=+138.486143642" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.386890 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.387215 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.887191655 +0000 UTC m=+139.008374786 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.402720 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lptgx" podStartSLOduration=121.402702512 podStartE2EDuration="2m1.402702512s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.36605123 +0000 UTC m=+138.487234361" watchObservedRunningTime="2025-10-02 06:48:48.402702512 +0000 UTC m=+138.523885643" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.403843 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" podStartSLOduration=120.403832986 podStartE2EDuration="2m0.403832986s" podCreationTimestamp="2025-10-02 06:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.40113521 +0000 UTC m=+138.522318361" watchObservedRunningTime="2025-10-02 06:48:48.403832986 +0000 UTC m=+138.525016117" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.489155 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: 
\"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.489462 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:48.98945188 +0000 UTC m=+139.110635011 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.510400 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-fxqfj" podStartSLOduration=121.510382717 podStartE2EDuration="2m1.510382717s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.488677351 +0000 UTC m=+138.609860482" watchObservedRunningTime="2025-10-02 06:48:48.510382717 +0000 UTC m=+138.631565848" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.510579 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cbv9l" podStartSLOduration=5.510575222 podStartE2EDuration="5.510575222s" podCreationTimestamp="2025-10-02 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.509333036 +0000 UTC 
m=+138.630516177" watchObservedRunningTime="2025-10-02 06:48:48.510575222 +0000 UTC m=+138.631758353" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.590281 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.604745 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.104714325 +0000 UTC m=+139.225897457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.617507 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-97p7p" podStartSLOduration=121.617484394 podStartE2EDuration="2m1.617484394s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.610940103 +0000 UTC m=+138.732123234" watchObservedRunningTime="2025-10-02 06:48:48.617484394 +0000 UTC m=+138.738667525" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.621790 
4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.621922 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" podStartSLOduration=121.621901893 podStartE2EDuration="2m1.621901893s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.560614516 +0000 UTC m=+138.681797667" watchObservedRunningTime="2025-10-02 06:48:48.621901893 +0000 UTC m=+138.743085024" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.624627 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:48 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:48 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:48 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.624708 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.660052 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" podStartSLOduration=121.660034554 podStartE2EDuration="2m1.660034554s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.659096255 +0000 UTC m=+138.780279396" watchObservedRunningTime="2025-10-02 06:48:48.660034554 +0000 UTC m=+138.781217686" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.692414 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.692658 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.192647898 +0000 UTC m=+139.313831029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.763439 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jrlfb" podStartSLOduration=121.763419584 podStartE2EDuration="2m1.763419584s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.761318127 +0000 UTC m=+138.882501268" 
watchObservedRunningTime="2025-10-02 06:48:48.763419584 +0000 UTC m=+138.884602714" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.793775 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.794204 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.294188499 +0000 UTC m=+139.415371630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.796240 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ph6hx" podStartSLOduration=121.796211677 podStartE2EDuration="2m1.796211677s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:48.794412244 +0000 UTC m=+138.915595385" watchObservedRunningTime="2025-10-02 06:48:48.796211677 +0000 UTC m=+138.917394808" Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.897514 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.898034 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.398016688 +0000 UTC m=+139.519199819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.998712 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.998915 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.498887288 +0000 UTC m=+139.620070419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:48 crc kubenswrapper[4786]: I1002 06:48:48.999195 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:48 crc kubenswrapper[4786]: E1002 06:48:48.999462 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.499453011 +0000 UTC m=+139.620636142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.075561 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lcqq" event={"ID":"5c06e2c1-045a-4e69-8b46-da06d4d21ac7","Type":"ContainerStarted","Data":"63a8def4af781e6130644b22816b2e9c7bd80e13f60c16e40a67498aca0b2090"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.075619 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lcqq" event={"ID":"5c06e2c1-045a-4e69-8b46-da06d4d21ac7","Type":"ContainerStarted","Data":"6f02ecd5a28b4cf4cb0004a06ad2998b4b7baf96bd877fa3c93ee7be788023cc"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.089148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" event={"ID":"63894f70-6430-4f05-b281-723de4b91274","Type":"ContainerStarted","Data":"d2e69ddd2ba4e63a0bb746bf6ac992ce80cb7dc840b8eb6013a110a340df7619"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.089205 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" event={"ID":"63894f70-6430-4f05-b281-723de4b91274","Type":"ContainerStarted","Data":"ef87d021c2436e3ac35bd2dbbfda96637b01e4e96aaba323a4ea75bed508bfc7"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.099904 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.100270 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.600248248 +0000 UTC m=+139.721431379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.100342 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.101456 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.601445128 +0000 UTC m=+139.722628260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.110305 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ph6hx" event={"ID":"ca2df496-2124-4881-ae15-4fa5a1a4f0ea","Type":"ContainerStarted","Data":"16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.137039 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" event={"ID":"6f80061e-e327-432d-a5dd-e0e671298e44","Type":"ContainerStarted","Data":"eb56807511c4967673c9a7bc882f4c23dcf0fe668de15a2932d902cce559d862"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.170473 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" event={"ID":"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8","Type":"ContainerStarted","Data":"bdc0cced3946eed6e37d7ac44a41fb438cdd1c0c2eba96e1472df5dd8848fddb"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.170519 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" event={"ID":"b276f4e7-8b50-4b3e-8b8b-989ddb05bca8","Type":"ContainerStarted","Data":"1e484b7cd16ed299dbd03a1e685af453b480255265fa4c2d048834c83266e962"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.171515 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" 
Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.180135 4786 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-v9ndb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.180177 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" podUID="b276f4e7-8b50-4b3e-8b8b-989ddb05bca8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.194786 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" podStartSLOduration=122.194773574 podStartE2EDuration="2m2.194773574s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.192091968 +0000 UTC m=+139.313275109" watchObservedRunningTime="2025-10-02 06:48:49.194773574 +0000 UTC m=+139.315956705" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.200254 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" event={"ID":"be77049a-4c1d-4997-9cf5-62578b78fe6a","Type":"ContainerStarted","Data":"aa17c8a9b3f2551653f3b4a79fa3b42a7c758a6a76989534e54a59f1d4c3bfbd"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.201166 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.201521 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.701506634 +0000 UTC m=+139.822689765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.213782 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" event={"ID":"24e3309e-db58-4fa4-a4f6-08fdb1ddb95c","Type":"ContainerStarted","Data":"c0c7e1cc7d2d78df6ca789acffc9be53869448067008abe96436809b8a918195"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.226200 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9lpsc" podStartSLOduration=122.226189086 podStartE2EDuration="2m2.226189086s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.225086224 +0000 UTC m=+139.346269355" watchObservedRunningTime="2025-10-02 06:48:49.226189086 +0000 UTC m=+139.347372217" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.235150 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-22wrg" event={"ID":"d72b8358-0199-4880-9a73-da40e64092a2","Type":"ContainerStarted","Data":"518dfd6e1a99b1304928d57dd261d112e8e0d39bc924ea77d4a40aae732231bf"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.257489 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" event={"ID":"9d377e9f-0505-4ad1-8d14-21052dd36255","Type":"ContainerStarted","Data":"d1181f2952f9d9d89881d9c749077bf229b1ccaf65e821c8ef43521002ba3258"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.274816 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" event={"ID":"4d92be71-105f-4c6b-90bc-f7e1097db26e","Type":"ContainerStarted","Data":"cbbc9c5088b7c8ac629ad25545d012e6f2428a5eb372849ab5a8a4b3a5d83271"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.274855 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" event={"ID":"4d92be71-105f-4c6b-90bc-f7e1097db26e","Type":"ContainerStarted","Data":"8f23dced49b321b63e43bbc8424d8401bb345640dfdd0b66f3ddfe1b107ee18a"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.303448 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.304844 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 06:48:49.804831047 +0000 UTC m=+139.926014178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.313572 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" event={"ID":"1d9c5c0a-e100-4718-8885-ddf84b4f4d03","Type":"ContainerStarted","Data":"000f8027e9b4e9d296da117e7c4068609999d3fc4a8855753a8bb7a40b258e4c"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.319045 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7lklr" podStartSLOduration=122.319029165 podStartE2EDuration="2m2.319029165s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.277386496 +0000 UTC m=+139.398569637" watchObservedRunningTime="2025-10-02 06:48:49.319029165 +0000 UTC m=+139.440212296" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.341936 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" event={"ID":"fd2e06bf-bd6f-47e1-bf48-d7ced01c8dba","Type":"ContainerStarted","Data":"6e634bf95fe60e039ddcc9c9bde2afc95cc2e406da58f9a4d77cf23d35e14c5a"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.342002 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.365676 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pzs4z" podStartSLOduration=122.365656017 podStartE2EDuration="2m2.365656017s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.313095912 +0000 UTC m=+139.434279053" watchObservedRunningTime="2025-10-02 06:48:49.365656017 +0000 UTC m=+139.486839148" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.365980 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" podStartSLOduration=122.365973629 podStartE2EDuration="2m2.365973629s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.364789643 +0000 UTC m=+139.485972784" watchObservedRunningTime="2025-10-02 06:48:49.365973629 +0000 UTC m=+139.487156761" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.368427 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" event={"ID":"8939ecee-fc8b-415c-b820-b1dd56984b4f","Type":"ContainerStarted","Data":"20ab0c65418c4db6789ace131a5ad29b3b4351547925dedcd61d5c81d6fa3ca3"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.368459 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" event={"ID":"8939ecee-fc8b-415c-b820-b1dd56984b4f","Type":"ContainerStarted","Data":"68cafad44b8c1c5885d417200dd0a6a97371af204737aae4515d5a0c95ec0b61"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 
06:48:49.388421 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" event={"ID":"6cba0d29-ec3e-4899-a57f-1c398999e668","Type":"ContainerStarted","Data":"d38058b36586766d4ccc10159a3d7ac39ef7ca10a97cbe856ff504c0181a6ed6"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.388455 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" event={"ID":"6cba0d29-ec3e-4899-a57f-1c398999e668","Type":"ContainerStarted","Data":"28a23e0589527017441487b78c2e22a423d1a149a59142be52e47719df05acc5"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.393034 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" podStartSLOduration=122.393023061 podStartE2EDuration="2m2.393023061s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.390068136 +0000 UTC m=+139.511251277" watchObservedRunningTime="2025-10-02 06:48:49.393023061 +0000 UTC m=+139.514206192" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.404033 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" event={"ID":"e61739fd-b2a8-4bca-a8a8-caf1260a42bb","Type":"ContainerStarted","Data":"c0c7e2ff30ce82c29e564abc1249cb259a9fb5b2638a33a2913229e4f28b3145"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.404083 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" event={"ID":"e61739fd-b2a8-4bca-a8a8-caf1260a42bb","Type":"ContainerStarted","Data":"6a6804ae8e9aea2a055d7024c7a7426fc4c125e2e8be1007def19683f00f1a80"} Oct 02 06:48:49 crc 
kubenswrapper[4786]: I1002 06:48:49.404337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.404554 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:49.904533962 +0000 UTC m=+140.025717094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.404836 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.406203 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 06:48:49.906184191 +0000 UTC m=+140.027367323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.414187 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rtc95" event={"ID":"27f8eb1a-04be-4c48-b96c-c6c928e1b9d1","Type":"ContainerStarted","Data":"6048011e2d86dcc10881a353f6a6915ffb86dee56a73343245663dd146abf7d7"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.414790 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-rtc95" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.430368 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-sgsdc" podStartSLOduration=121.430344915 podStartE2EDuration="2m1.430344915s" podCreationTimestamp="2025-10-02 06:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.410811388 +0000 UTC m=+139.531994529" watchObservedRunningTime="2025-10-02 06:48:49.430344915 +0000 UTC m=+139.551528046" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.436022 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-rtc95 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 02 06:48:49 
crc kubenswrapper[4786]: I1002 06:48:49.436069 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rtc95" podUID="27f8eb1a-04be-4c48-b96c-c6c928e1b9d1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.450377 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" event={"ID":"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346","Type":"ContainerStarted","Data":"c85e007cb16adbce2e2cb85642cf1ba4aa03380afb65319752b68e8886a4a1e9"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.451287 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" podStartSLOduration=122.451272236 podStartE2EDuration="2m2.451272236s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.450772437 +0000 UTC m=+139.571955568" watchObservedRunningTime="2025-10-02 06:48:49.451272236 +0000 UTC m=+139.572455366" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.451398 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.457241 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" event={"ID":"d91d0313-de7a-4ba3-85c6-0b96327f9ee2","Type":"ContainerStarted","Data":"8e70dda2fbaf046562f5b647932cdd5df9e7e44b9949e1fc302673642776cfc9"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.457272 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" event={"ID":"d91d0313-de7a-4ba3-85c6-0b96327f9ee2","Type":"ContainerStarted","Data":"ef333828f9a5aff669faa13dbb56a801b89ce820d08ffc80202f2ae8be6b7e46"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.459131 4786 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-jxntq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.459174 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" podUID="95c303f7-ecd5-4fb1-b071-1f1ecdd3d346" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.462195 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" event={"ID":"e93df4ca-fb71-49dc-8d81-0e3cca035c90","Type":"ContainerStarted","Data":"0ed698c78f1bfadf1d37a89cc43972be89ffb595989110d907cc0b069a912a32"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.469474 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" event={"ID":"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9","Type":"ContainerStarted","Data":"5a14f1c59388e2174b3f88e6cf9102132c99ab7b38191d0ae8987d1f7d0ac3e5"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.469509 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" 
event={"ID":"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9","Type":"ContainerStarted","Data":"cb128ff34abbe0f6969e967c047694ebe4a3752996c14eb2f39d77ded99c7dc6"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.469904 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.472888 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" podStartSLOduration=122.472878794 podStartE2EDuration="2m2.472878794s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.471630867 +0000 UTC m=+139.592814008" watchObservedRunningTime="2025-10-02 06:48:49.472878794 +0000 UTC m=+139.594061925" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.485792 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv" event={"ID":"38b8798f-7843-48e8-b3ae-d0124362b72c","Type":"ContainerStarted","Data":"cc594a99f7e7bda15e7dcdb524158153957478b9221ba8fdf6cf6e1ea1429fe1"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.490897 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-skstx" event={"ID":"9d9450d4-2699-4955-84e2-3fb54a67b27e","Type":"ContainerStarted","Data":"a971a698f94fcba5b91d4cc6a58fcb8aa7377836986e75088871be1ece504ebc"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.507402 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.508323 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-rtc95" podStartSLOduration=122.508307236 podStartE2EDuration="2m2.508307236s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.507118353 +0000 UTC m=+139.628301483" watchObservedRunningTime="2025-10-02 06:48:49.508307236 +0000 UTC m=+139.629490368" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.508586 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" event={"ID":"4a4be49b-2638-4ce3-b349-ee4162f13f50","Type":"ContainerStarted","Data":"7d3803ce4977346d4a8fed08a8cdcaa7cce8d9a1b9b419d354df9af76cad6cee"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.508625 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" event={"ID":"4a4be49b-2638-4ce3-b349-ee4162f13f50","Type":"ContainerStarted","Data":"46dbb5e8fe21be6811bdbc96915519d8ebc41841095cd66c5cb381d2501ee2f1"} Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.508397 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.008379675 +0000 UTC m=+140.129562806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.511606 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" event={"ID":"d280e549-70ac-41b1-b9ac-655328a6b6a6","Type":"ContainerStarted","Data":"223d77ba6d8aa46ce867d2131848406d6f7ce2448e7bdcbf831806da97be5d49"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.511742 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" event={"ID":"d280e549-70ac-41b1-b9ac-655328a6b6a6","Type":"ContainerStarted","Data":"79e01dcae39a86eaf82dcf313e933cef55649079097afef6ac841189f431c65b"} Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.525070 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.553537 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k9j6b" podStartSLOduration=122.553510047 podStartE2EDuration="2m2.553510047s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.551893533 +0000 UTC m=+139.673076683" watchObservedRunningTime="2025-10-02 06:48:49.553510047 +0000 UTC m=+139.674693179" Oct 
02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.599451 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lbfpr" podStartSLOduration=122.599430009 podStartE2EDuration="2m2.599430009s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.59734212 +0000 UTC m=+139.718525270" watchObservedRunningTime="2025-10-02 06:48:49.599430009 +0000 UTC m=+139.720613141" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.611117 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.614604 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.114590963 +0000 UTC m=+140.235774094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.630088 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:49 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:49 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:49 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.630136 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.638311 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" podStartSLOduration=122.638297855 podStartE2EDuration="2m2.638297855s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.631663093 +0000 UTC m=+139.752846243" watchObservedRunningTime="2025-10-02 06:48:49.638297855 +0000 UTC m=+139.759480986" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.712877 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.713790 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.21377251 +0000 UTC m=+140.334955641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.768231 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-fbbcv" podStartSLOduration=122.76819697 podStartE2EDuration="2m2.76819697s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.762916926 +0000 UTC m=+139.884100067" watchObservedRunningTime="2025-10-02 06:48:49.76819697 +0000 UTC m=+139.889380101" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.768978 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" podStartSLOduration=122.768970808 
podStartE2EDuration="2m2.768970808s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.720865663 +0000 UTC m=+139.842048814" watchObservedRunningTime="2025-10-02 06:48:49.768970808 +0000 UTC m=+139.890153939" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.792284 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jw92k" podStartSLOduration=122.792271871 podStartE2EDuration="2m2.792271871s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.790166887 +0000 UTC m=+139.911350019" watchObservedRunningTime="2025-10-02 06:48:49.792271871 +0000 UTC m=+139.913455002" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.814608 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.815079 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.315062855 +0000 UTC m=+140.436245986 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.835556 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" podStartSLOduration=121.835533149 podStartE2EDuration="2m1.835533149s" podCreationTimestamp="2025-10-02 06:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:49.832816957 +0000 UTC m=+139.954000098" watchObservedRunningTime="2025-10-02 06:48:49.835533149 +0000 UTC m=+139.956716281" Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.915796 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.915915 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.415897638 +0000 UTC m=+140.537080768 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:49 crc kubenswrapper[4786]: I1002 06:48:49.916355 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:49 crc kubenswrapper[4786]: E1002 06:48:49.916618 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.416609558 +0000 UTC m=+140.537792689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.017290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.017388 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.517374006 +0000 UTC m=+140.638557137 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.017593 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.017871 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.517862052 +0000 UTC m=+140.639045183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.119563 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.119829 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.619805676 +0000 UTC m=+140.740988808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.120115 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.120433 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.620422646 +0000 UTC m=+140.741605778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.221109 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.221460 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.721448632 +0000 UTC m=+140.842631763 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.322390 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.322654 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.822641692 +0000 UTC m=+140.943824824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.423304 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.423450 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.923428253 +0000 UTC m=+141.044611384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.423822 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.424091 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:50.924079437 +0000 UTC m=+141.045262569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.516619 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" event={"ID":"95c303f7-ecd5-4fb1-b071-1f1ecdd3d346","Type":"ContainerStarted","Data":"fe5dc5c7cdef988b67e1aee9c3055b15c6fcf5589a6169c4d85ff9eb45471c68"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.518597 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lcqq" event={"ID":"5c06e2c1-045a-4e69-8b46-da06d4d21ac7","Type":"ContainerStarted","Data":"9469f74002146b5ac33c7f8987bbab1fa15babb2c036ff290a3b9d0ce6f1564d"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.519181 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.520463 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-22wrg" event={"ID":"d72b8358-0199-4880-9a73-da40e64092a2","Type":"ContainerStarted","Data":"d7004ee8d833f6026a924981afa0fc6f95ac8edccab2661a90957511641ac9f1"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.521604 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5f947" event={"ID":"e93df4ca-fb71-49dc-8d81-0e3cca035c90","Type":"ContainerStarted","Data":"92cdf5c8a85eedb999d33009689593f48a860a538907ba9b0894fd60ce361b6c"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 
06:48:50.523266 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" event={"ID":"45186430-df82-4c0f-aaf7-fc032b808f34","Type":"ContainerStarted","Data":"b4682c6eaa1dfecf6479a86c0ad96a3ec6b6d40677603d49b2b13e9afa71ec2f"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.523291 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" event={"ID":"45186430-df82-4c0f-aaf7-fc032b808f34","Type":"ContainerStarted","Data":"4bf590071b992c9bd841b273d22658ddc5e772af24e33209febeec595336897a"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.524248 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.524395 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.024376299 +0000 UTC m=+141.145559431 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.524478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.524756 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.02474562 +0000 UTC m=+141.145928751 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.525191 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" event={"ID":"24e3309e-db58-4fa4-a4f6-08fdb1ddb95c","Type":"ContainerStarted","Data":"0f4fbed3285bd6ace5d728333198e3d4c9291dc9e57bebbd08c6f0372c9d2f5f"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.530649 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2542" event={"ID":"8939ecee-fc8b-415c-b820-b1dd56984b4f","Type":"ContainerStarted","Data":"72751a38ae45d63ba0f6a7f70c4730f482694be16255a75b01c099fb9d62cb21"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.535717 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" event={"ID":"63894f70-6430-4f05-b281-723de4b91274","Type":"ContainerStarted","Data":"e9d54b4193749c2a586f000f84b75700fc85e7236de44412b62b6d4511219649"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.537063 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" event={"ID":"6f80061e-e327-432d-a5dd-e0e671298e44","Type":"ContainerStarted","Data":"ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.537591 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.540793 4786 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-swzbl container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.540821 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" podUID="6f80061e-e327-432d-a5dd-e0e671298e44" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.547391 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" event={"ID":"f2848bc4-b5b8-4a1c-b8a7-4141f41a05d9","Type":"ContainerStarted","Data":"13aef5edc0929629d09cdacf9f2f7d77f2968d411c832cf43563316b6ad0c5d1"} Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.550376 4786 patch_prober.go:28] interesting pod/downloads-7954f5f757-rtc95 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.550404 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rtc95" podUID="27f8eb1a-04be-4c48-b96c-c6c928e1b9d1" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.37:8080/\": dial tcp 10.217.0.37:8080: connect: connection refused" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.554434 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-jxntq" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.569479 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-22wrg" podStartSLOduration=7.569468912 podStartE2EDuration="7.569468912s" podCreationTimestamp="2025-10-02 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:50.568929288 +0000 UTC m=+140.690112430" watchObservedRunningTime="2025-10-02 06:48:50.569468912 +0000 UTC m=+140.690652043" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.569885 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2lcqq" podStartSLOduration=7.569878209 podStartE2EDuration="7.569878209s" podCreationTimestamp="2025-10-02 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:50.534113187 +0000 UTC m=+140.655296328" watchObservedRunningTime="2025-10-02 06:48:50.569878209 +0000 UTC m=+140.691061339" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.583282 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-v9ndb" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.587996 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.588046 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.594487 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xw7f8" podStartSLOduration=123.594477883 podStartE2EDuration="2m3.594477883s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:50.594134001 +0000 UTC m=+140.715317142" watchObservedRunningTime="2025-10-02 06:48:50.594477883 +0000 UTC m=+140.715661015" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.597753 4786 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5f7p6 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]log ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]etcd ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/generic-apiserver-start-informers ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/max-in-flight-filter ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 02 06:48:50 crc kubenswrapper[4786]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 02 06:48:50 crc kubenswrapper[4786]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/project.openshift.io-projectcache ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-startinformers ok Oct 02 06:48:50 crc kubenswrapper[4786]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 02 06:48:50 
crc kubenswrapper[4786]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 02 06:48:50 crc kubenswrapper[4786]: livez check failed Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.597805 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" podUID="1d9c5c0a-e100-4718-8885-ddf84b4f4d03" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.611761 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.611792 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.625715 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.625953 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.1259326 +0000 UTC m=+141.247115731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.627956 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:50 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:50 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:50 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.628160 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.628617 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.631967 4786 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.640104 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" podStartSLOduration=123.6400898 podStartE2EDuration="2m3.6400898s" 
podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:50.634841337 +0000 UTC m=+140.756024478" watchObservedRunningTime="2025-10-02 06:48:50.6400898 +0000 UTC m=+140.761272931" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.713676 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-dhb25" podStartSLOduration=123.71365925 podStartE2EDuration="2m3.71365925s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:50.711485839 +0000 UTC m=+140.832668980" watchObservedRunningTime="2025-10-02 06:48:50.71365925 +0000 UTC m=+140.834842382" Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.728589 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.769084 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.269065193 +0000 UTC m=+141.390248325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.848859 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.849196 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.349183435 +0000 UTC m=+141.470366566 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:50 crc kubenswrapper[4786]: I1002 06:48:50.951370 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:50 crc kubenswrapper[4786]: E1002 06:48:50.951713 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.451678776 +0000 UTC m=+141.572861907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.052628 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:51 crc kubenswrapper[4786]: E1002 06:48:51.052723 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.552705532 +0000 UTC m=+141.673888662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.053283 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:51 crc kubenswrapper[4786]: E1002 06:48:51.053560 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.553552227 +0000 UTC m=+141.674735358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.154447 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:51 crc kubenswrapper[4786]: E1002 06:48:51.154589 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.654573934 +0000 UTC m=+141.775757065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.154852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:51 crc kubenswrapper[4786]: E1002 06:48:51.155116 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.65510489 +0000 UTC m=+141.776288021 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-66zdg" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.255482 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:51 crc kubenswrapper[4786]: E1002 06:48:51.255736 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 06:48:51.755725526 +0000 UTC m=+141.876908657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.322674 4786 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T06:48:50.631993716Z","Handler":null,"Name":""} Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.325245 4786 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.325275 4786 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.357343 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.359797 4786 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.359835 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.378524 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-66zdg\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.458887 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.463812 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.549657 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fkn49"] Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.550810 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.553173 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.554577 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" event={"ID":"45186430-df82-4c0f-aaf7-fc032b808f34","Type":"ContainerStarted","Data":"4be623f513abd3a7206679b637756027b6574a0b8d37196e30d84fdf2923ed96"} Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.554614 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" event={"ID":"45186430-df82-4c0f-aaf7-fc032b808f34","Type":"ContainerStarted","Data":"fc1d805865b5d46f1dab8e738ccd55601cc35aac461b41559be50136cf936ca1"} Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.559121 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.560237 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-gbqpn" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.560651 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-catalog-content\") pod \"community-operators-fkn49\" (UID: 
\"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.560726 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-utilities\") pod \"community-operators-fkn49\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.560782 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4bm5\" (UniqueName: \"kubernetes.io/projected/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-kube-api-access-t4bm5\") pod \"community-operators-fkn49\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.561125 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkn49"] Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.602383 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.623806 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:51 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:51 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:51 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.623838 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.626001 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rfmkw" podStartSLOduration=8.62598727 podStartE2EDuration="8.62598727s" podCreationTimestamp="2025-10-02 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:51.602125714 +0000 UTC m=+141.723308845" watchObservedRunningTime="2025-10-02 06:48:51.62598727 +0000 UTC m=+141.747170401" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.661902 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-utilities\") pod \"community-operators-fkn49\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.661965 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-t4bm5\" (UniqueName: \"kubernetes.io/projected/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-kube-api-access-t4bm5\") pod \"community-operators-fkn49\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.662291 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-catalog-content\") pod \"community-operators-fkn49\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.663921 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-catalog-content\") pod \"community-operators-fkn49\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.664448 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-utilities\") pod \"community-operators-fkn49\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.689678 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4bm5\" (UniqueName: \"kubernetes.io/projected/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-kube-api-access-t4bm5\") pod \"community-operators-fkn49\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") " pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.746250 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-c9mns"] Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.746988 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.751097 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.755086 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9mns"] Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.763280 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-catalog-content\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.763404 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncpmx\" (UniqueName: \"kubernetes.io/projected/074fa372-0ddd-47ea-a3ad-9203d7574875-kube-api-access-ncpmx\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.763438 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-utilities\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.792257 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-66zdg"] Oct 02 06:48:51 crc kubenswrapper[4786]: W1002 06:48:51.797615 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf37c1514_a1f1_44b3_949d_51d1b5d4ae6e.slice/crio-ae13da93cb97fbca5e20562e2557ba48aa2cea739c07578efbad754a7a235e58 WatchSource:0}: Error finding container ae13da93cb97fbca5e20562e2557ba48aa2cea739c07578efbad754a7a235e58: Status 404 returned error can't find the container with id ae13da93cb97fbca5e20562e2557ba48aa2cea739c07578efbad754a7a235e58 Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.863228 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.864097 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncpmx\" (UniqueName: \"kubernetes.io/projected/074fa372-0ddd-47ea-a3ad-9203d7574875-kube-api-access-ncpmx\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.864123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-utilities\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.864183 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-catalog-content\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 
06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.864632 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-catalog-content\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.864650 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-utilities\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.879542 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncpmx\" (UniqueName: \"kubernetes.io/projected/074fa372-0ddd-47ea-a3ad-9203d7574875-kube-api-access-ncpmx\") pod \"certified-operators-c9mns\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") " pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.950286 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rfgkp"] Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.952423 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.960004 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rfgkp"] Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.967602 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g468t\" (UniqueName: \"kubernetes.io/projected/1e96c39a-5626-40ce-ad0f-f455c3292478-kube-api-access-g468t\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.967659 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-catalog-content\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:51 crc kubenswrapper[4786]: I1002 06:48:51.967773 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-utilities\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.008377 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fkn49"] Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.062979 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.070580 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g468t\" (UniqueName: \"kubernetes.io/projected/1e96c39a-5626-40ce-ad0f-f455c3292478-kube-api-access-g468t\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.071143 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-catalog-content\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.071237 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-utilities\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.071622 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-utilities\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.071903 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-catalog-content\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " 
pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.099330 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g468t\" (UniqueName: \"kubernetes.io/projected/1e96c39a-5626-40ce-ad0f-f455c3292478-kube-api-access-g468t\") pod \"community-operators-rfgkp\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.144661 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6nj4j"] Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.145598 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.169669 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6nj4j"] Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.172204 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-catalog-content\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.172281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-utilities\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.172342 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5zrc\" 
(UniqueName: \"kubernetes.io/projected/1638ccfb-4503-4402-bec5-e1f8a9588c38-kube-api-access-l5zrc\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.186726 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.229388 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wpphq" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.254786 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c9mns"] Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.267198 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.273935 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-catalog-content\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.274013 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-utilities\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.274088 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l5zrc\" (UniqueName: \"kubernetes.io/projected/1638ccfb-4503-4402-bec5-e1f8a9588c38-kube-api-access-l5zrc\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.274814 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-catalog-content\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.274876 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-utilities\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.293289 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5zrc\" (UniqueName: \"kubernetes.io/projected/1638ccfb-4503-4402-bec5-e1f8a9588c38-kube-api-access-l5zrc\") pod \"certified-operators-6nj4j\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.435585 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rfgkp"] Oct 02 06:48:52 crc kubenswrapper[4786]: W1002 06:48:52.459383 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e96c39a_5626_40ce_ad0f_f455c3292478.slice/crio-391b25f705b004c1900538ca914962f65bb0b1d21385ad701b6b15e550b51727 WatchSource:0}: Error finding container 
391b25f705b004c1900538ca914962f65bb0b1d21385ad701b6b15e550b51727: Status 404 returned error can't find the container with id 391b25f705b004c1900538ca914962f65bb0b1d21385ad701b6b15e550b51727 Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.521765 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.560268 4786 generic.go:334] "Generic (PLEG): container finished" podID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerID="a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61" exitCode=0 Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.560330 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9mns" event={"ID":"074fa372-0ddd-47ea-a3ad-9203d7574875","Type":"ContainerDied","Data":"a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61"} Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.560353 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9mns" event={"ID":"074fa372-0ddd-47ea-a3ad-9203d7574875","Type":"ContainerStarted","Data":"3341053ddcef4e9438fda668001e1033bfb123f137a043da2c511ddd2521b76b"} Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.562248 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.563282 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" event={"ID":"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e","Type":"ContainerStarted","Data":"732f58723a5dbe464dbb8d7754ca3725e70115056af00447e118d68170979533"} Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.563312 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" 
event={"ID":"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e","Type":"ContainerStarted","Data":"ae13da93cb97fbca5e20562e2557ba48aa2cea739c07578efbad754a7a235e58"} Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.563445 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.565369 4786 generic.go:334] "Generic (PLEG): container finished" podID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerID="c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2" exitCode=0 Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.565404 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfgkp" event={"ID":"1e96c39a-5626-40ce-ad0f-f455c3292478","Type":"ContainerDied","Data":"c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2"} Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.565420 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfgkp" event={"ID":"1e96c39a-5626-40ce-ad0f-f455c3292478","Type":"ContainerStarted","Data":"391b25f705b004c1900538ca914962f65bb0b1d21385ad701b6b15e550b51727"} Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.568207 4786 generic.go:334] "Generic (PLEG): container finished" podID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerID="29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764" exitCode=0 Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.569194 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkn49" event={"ID":"0929cab2-9dd2-42c2-a91f-1b98bf72ace3","Type":"ContainerDied","Data":"29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764"} Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.569227 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkn49" 
event={"ID":"0929cab2-9dd2-42c2-a91f-1b98bf72ace3","Type":"ContainerStarted","Data":"5ccaab8d4876694ce729139a7c039be57c324955fe400981f0eca869cc9c9aeb"} Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.604409 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" podStartSLOduration=125.604394404 podStartE2EDuration="2m5.604394404s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:52.602791003 +0000 UTC m=+142.723974144" watchObservedRunningTime="2025-10-02 06:48:52.604394404 +0000 UTC m=+142.725577534" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.628562 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:52 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:52 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:52 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.628604 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.642445 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.644685 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.646943 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.647339 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.649990 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.680907 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.681116 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.782774 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.782878 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.782947 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.797720 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.881521 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6nj4j"] Oct 02 06:48:52 crc kubenswrapper[4786]: I1002 06:48:52.959545 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.088650 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 06:48:53 crc kubenswrapper[4786]: W1002 06:48:53.102804 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf4880b62_d5a5_402a_b285_f6c0ca7e3f45.slice/crio-4030f061f3cade302fa1c28ef28ac4a9401ef61b921e32f4c626105d75d73d5e WatchSource:0}: Error finding container 4030f061f3cade302fa1c28ef28ac4a9401ef61b921e32f4c626105d75d73d5e: Status 404 returned error can't find the container with id 4030f061f3cade302fa1c28ef28ac4a9401ef61b921e32f4c626105d75d73d5e Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.588389 4786 generic.go:334] "Generic (PLEG): container finished" podID="d91d0313-de7a-4ba3-85c6-0b96327f9ee2" containerID="8e70dda2fbaf046562f5b647932cdd5df9e7e44b9949e1fc302673642776cfc9" exitCode=0 Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.588471 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" event={"ID":"d91d0313-de7a-4ba3-85c6-0b96327f9ee2","Type":"ContainerDied","Data":"8e70dda2fbaf046562f5b647932cdd5df9e7e44b9949e1fc302673642776cfc9"} Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.593715 4786 generic.go:334] "Generic (PLEG): container finished" podID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerID="084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa" exitCode=0 Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.593785 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nj4j" event={"ID":"1638ccfb-4503-4402-bec5-e1f8a9588c38","Type":"ContainerDied","Data":"084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa"} Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 
06:48:53.593814 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nj4j" event={"ID":"1638ccfb-4503-4402-bec5-e1f8a9588c38","Type":"ContainerStarted","Data":"06f08de8521fe8102e9beaf6145a970c441601e3fd7afc37dd8565e006ca8143"} Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.596208 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4880b62-d5a5-402a-b285-f6c0ca7e3f45","Type":"ContainerStarted","Data":"822eec69eaa7dc1c95352f013db6490176a7b1c71d99b3a190df489dbdf4cebc"} Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.596268 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4880b62-d5a5-402a-b285-f6c0ca7e3f45","Type":"ContainerStarted","Data":"4030f061f3cade302fa1c28ef28ac4a9401ef61b921e32f4c626105d75d73d5e"} Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.623882 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:53 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:53 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:53 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.623926 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.637478 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
podStartSLOduration=1.637460263 podStartE2EDuration="1.637460263s" podCreationTimestamp="2025-10-02 06:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:48:53.626891137 +0000 UTC m=+143.748074278" watchObservedRunningTime="2025-10-02 06:48:53.637460263 +0000 UTC m=+143.758643393" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.749195 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tbkbh"] Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.750638 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.753977 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.758240 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbkbh"] Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.801201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rn72\" (UniqueName: \"kubernetes.io/projected/dfdbf275-781b-4a7d-b943-b592d682d11a-kube-api-access-4rn72\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.801315 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-utilities\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 
06:48:53.801653 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-catalog-content\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.902198 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-catalog-content\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.902250 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rn72\" (UniqueName: \"kubernetes.io/projected/dfdbf275-781b-4a7d-b943-b592d682d11a-kube-api-access-4rn72\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.902290 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-utilities\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.902761 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-utilities\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.902910 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-catalog-content\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:53 crc kubenswrapper[4786]: I1002 06:48:53.916881 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rn72\" (UniqueName: \"kubernetes.io/projected/dfdbf275-781b-4a7d-b943-b592d682d11a-kube-api-access-4rn72\") pod \"redhat-marketplace-tbkbh\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") " pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.066461 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.146529 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ghfbw"] Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.147443 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.161151 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghfbw"] Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.205724 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-catalog-content\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.205796 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxn5p\" (UniqueName: \"kubernetes.io/projected/787bb954-7ee0-4a9d-ba2b-9f8352adba43-kube-api-access-fxn5p\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.205852 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-utilities\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.245418 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbkbh"] Oct 02 06:48:54 crc kubenswrapper[4786]: W1002 06:48:54.250780 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfdbf275_781b_4a7d_b943_b592d682d11a.slice/crio-34d54367b6d6bc6fa99262e5b9962a68da4bc1e49c6f6d355ce81c0d9da1577c WatchSource:0}: Error 
finding container 34d54367b6d6bc6fa99262e5b9962a68da4bc1e49c6f6d355ce81c0d9da1577c: Status 404 returned error can't find the container with id 34d54367b6d6bc6fa99262e5b9962a68da4bc1e49c6f6d355ce81c0d9da1577c Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.307464 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxn5p\" (UniqueName: \"kubernetes.io/projected/787bb954-7ee0-4a9d-ba2b-9f8352adba43-kube-api-access-fxn5p\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.307566 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-utilities\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.307644 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-catalog-content\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.308188 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-utilities\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.308204 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-catalog-content\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.321731 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxn5p\" (UniqueName: \"kubernetes.io/projected/787bb954-7ee0-4a9d-ba2b-9f8352adba43-kube-api-access-fxn5p\") pod \"redhat-marketplace-ghfbw\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.465547 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.604208 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4880b62-d5a5-402a-b285-f6c0ca7e3f45" containerID="822eec69eaa7dc1c95352f013db6490176a7b1c71d99b3a190df489dbdf4cebc" exitCode=0 Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.604295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4880b62-d5a5-402a-b285-f6c0ca7e3f45","Type":"ContainerDied","Data":"822eec69eaa7dc1c95352f013db6490176a7b1c71d99b3a190df489dbdf4cebc"} Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.608081 4786 generic.go:334] "Generic (PLEG): container finished" podID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerID="4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76" exitCode=0 Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.608270 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbkbh" event={"ID":"dfdbf275-781b-4a7d-b943-b592d682d11a","Type":"ContainerDied","Data":"4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76"} Oct 02 06:48:54 crc 
kubenswrapper[4786]: I1002 06:48:54.608295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbkbh" event={"ID":"dfdbf275-781b-4a7d-b943-b592d682d11a","Type":"ContainerStarted","Data":"34d54367b6d6bc6fa99262e5b9962a68da4bc1e49c6f6d355ce81c0d9da1577c"} Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.623112 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:54 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:54 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:54 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.623144 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.673019 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghfbw"] Oct 02 06:48:54 crc kubenswrapper[4786]: W1002 06:48:54.680320 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod787bb954_7ee0_4a9d_ba2b_9f8352adba43.slice/crio-7bd883e0aaacc83b7aa091210fc8c7a1d965af9430f485e809f826b4d14151c9 WatchSource:0}: Error finding container 7bd883e0aaacc83b7aa091210fc8c7a1d965af9430f485e809f826b4d14151c9: Status 404 returned error can't find the container with id 7bd883e0aaacc83b7aa091210fc8c7a1d965af9430f485e809f826b4d14151c9 Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.747996 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-fskng"] Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.748906 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.756602 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.763145 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fskng"] Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.814978 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-catalog-content\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.815093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbsv6\" (UniqueName: \"kubernetes.io/projected/68893526-5b68-42b3-8711-a11fed5996e7-kube-api-access-kbsv6\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.815174 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-utilities\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.823007 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.916179 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-654fb\" (UniqueName: \"kubernetes.io/projected/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-kube-api-access-654fb\") pod \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.916271 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-config-volume\") pod \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.916292 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-secret-volume\") pod \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\" (UID: \"d91d0313-de7a-4ba3-85c6-0b96327f9ee2\") " Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.916432 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-utilities\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.916502 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-catalog-content\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 
06:48:54.916568 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbsv6\" (UniqueName: \"kubernetes.io/projected/68893526-5b68-42b3-8711-a11fed5996e7-kube-api-access-kbsv6\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.917477 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-config-volume" (OuterVolumeSpecName: "config-volume") pod "d91d0313-de7a-4ba3-85c6-0b96327f9ee2" (UID: "d91d0313-de7a-4ba3-85c6-0b96327f9ee2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.917678 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-utilities\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.917732 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-catalog-content\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.928316 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-kube-api-access-654fb" (OuterVolumeSpecName: "kube-api-access-654fb") pod "d91d0313-de7a-4ba3-85c6-0b96327f9ee2" (UID: "d91d0313-de7a-4ba3-85c6-0b96327f9ee2"). InnerVolumeSpecName "kube-api-access-654fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.928739 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d91d0313-de7a-4ba3-85c6-0b96327f9ee2" (UID: "d91d0313-de7a-4ba3-85c6-0b96327f9ee2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:48:54 crc kubenswrapper[4786]: I1002 06:48:54.934104 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbsv6\" (UniqueName: \"kubernetes.io/projected/68893526-5b68-42b3-8711-a11fed5996e7-kube-api-access-kbsv6\") pod \"redhat-operators-fskng\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") " pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.018153 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-654fb\" (UniqueName: \"kubernetes.io/projected/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-kube-api-access-654fb\") on node \"crc\" DevicePath \"\"" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.018401 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.018411 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d91d0313-de7a-4ba3-85c6-0b96327f9ee2-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.093321 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.157519 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4vp67"] Oct 02 06:48:55 crc kubenswrapper[4786]: E1002 06:48:55.157817 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d91d0313-de7a-4ba3-85c6-0b96327f9ee2" containerName="collect-profiles" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.157837 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91d0313-de7a-4ba3-85c6-0b96327f9ee2" containerName="collect-profiles" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.157944 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d91d0313-de7a-4ba3-85c6-0b96327f9ee2" containerName="collect-profiles" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.158579 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.162858 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4vp67"] Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.220382 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h57qp\" (UniqueName: \"kubernetes.io/projected/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-kube-api-access-h57qp\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.220464 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-catalog-content\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " 
pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.220494 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-utilities\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.292661 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fskng"] Oct 02 06:48:55 crc kubenswrapper[4786]: W1002 06:48:55.298125 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68893526_5b68_42b3_8711_a11fed5996e7.slice/crio-567fd823aea2f0c68eb2febb5f9672715cfa022f3a832531ae6517fbb2921814 WatchSource:0}: Error finding container 567fd823aea2f0c68eb2febb5f9672715cfa022f3a832531ae6517fbb2921814: Status 404 returned error can't find the container with id 567fd823aea2f0c68eb2febb5f9672715cfa022f3a832531ae6517fbb2921814 Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.321382 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h57qp\" (UniqueName: \"kubernetes.io/projected/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-kube-api-access-h57qp\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.321443 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-catalog-content\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 
06:48:55.321470 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-utilities\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.321873 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-utilities\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.321951 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-catalog-content\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.334945 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h57qp\" (UniqueName: \"kubernetes.io/projected/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-kube-api-access-h57qp\") pod \"redhat-operators-4vp67\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.477882 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.590155 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.594814 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5f7p6" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.623440 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:55 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:55 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:55 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.623480 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.653367 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" event={"ID":"d91d0313-de7a-4ba3-85c6-0b96327f9ee2","Type":"ContainerDied","Data":"ef333828f9a5aff669faa13dbb56a801b89ce820d08ffc80202f2ae8be6b7e46"} Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.653810 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.653732 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef333828f9a5aff669faa13dbb56a801b89ce820d08ffc80202f2ae8be6b7e46" Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.664007 4786 generic.go:334] "Generic (PLEG): container finished" podID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerID="4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c" exitCode=0 Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.665407 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghfbw" event={"ID":"787bb954-7ee0-4a9d-ba2b-9f8352adba43","Type":"ContainerDied","Data":"4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c"} Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.667361 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghfbw" event={"ID":"787bb954-7ee0-4a9d-ba2b-9f8352adba43","Type":"ContainerStarted","Data":"7bd883e0aaacc83b7aa091210fc8c7a1d965af9430f485e809f826b4d14151c9"} Oct 02 06:48:55 crc kubenswrapper[4786]: I1002 06:48:55.669816 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fskng" event={"ID":"68893526-5b68-42b3-8711-a11fed5996e7","Type":"ContainerStarted","Data":"567fd823aea2f0c68eb2febb5f9672715cfa022f3a832531ae6517fbb2921814"} Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.056032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:56 crc 
kubenswrapper[4786]: I1002 06:48:56.056087 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.056136 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.056794 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.060260 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.065103 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.157484 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.160154 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.188050 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.188605 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.188670 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.190629 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.191558 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.259215 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.259306 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.259816 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.259854 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.261277 4786 patch_prober.go:28] interesting pod/console-f9d7485db-ph6hx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 
06:48:56.261315 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ph6hx" podUID="ca2df496-2124-4881-ae15-4fa5a1a4f0ea" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.270416 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-rtc95" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.293870 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.298218 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.304124 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.360338 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.360430 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.360499 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.380047 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.503920 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.621656 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.623269 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:56 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:56 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:56 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:56 crc kubenswrapper[4786]: I1002 06:48:56.623307 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:57 crc kubenswrapper[4786]: I1002 06:48:57.497831 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 06:48:57 crc kubenswrapper[4786]: I1002 06:48:57.498061 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 06:48:57 crc kubenswrapper[4786]: I1002 06:48:57.625152 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:57 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:57 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:57 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:57 crc kubenswrapper[4786]: I1002 06:48:57.625218 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.047314 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.082459 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kubelet-dir\") pod \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\" (UID: \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\") " Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.082558 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kube-api-access\") pod \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\" (UID: \"f4880b62-d5a5-402a-b285-f6c0ca7e3f45\") " Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.083175 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f4880b62-d5a5-402a-b285-f6c0ca7e3f45" (UID: "f4880b62-d5a5-402a-b285-f6c0ca7e3f45"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.083605 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.091123 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f4880b62-d5a5-402a-b285-f6c0ca7e3f45" (UID: "f4880b62-d5a5-402a-b285-f6c0ca7e3f45"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.184250 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4880b62-d5a5-402a-b285-f6c0ca7e3f45-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 06:48:58 crc kubenswrapper[4786]: W1002 06:48:58.349392 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-0cc5bfb42150b3b8cdba4d286be37247060b9a1ae9f979f5f15f353c28ef1c99 WatchSource:0}: Error finding container 0cc5bfb42150b3b8cdba4d286be37247060b9a1ae9f979f5f15f353c28ef1c99: Status 404 returned error can't find the container with id 0cc5bfb42150b3b8cdba4d286be37247060b9a1ae9f979f5f15f353c28ef1c99 Oct 02 06:48:58 crc kubenswrapper[4786]: W1002 06:48:58.379817 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-cd5025eec8f0ea1667e256c8f45a86ee30ac3a1ae97c5817cabc6a6a148add84 WatchSource:0}: Error finding container cd5025eec8f0ea1667e256c8f45a86ee30ac3a1ae97c5817cabc6a6a148add84: Status 404 
returned error can't find the container with id cd5025eec8f0ea1667e256c8f45a86ee30ac3a1ae97c5817cabc6a6a148add84 Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.407357 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4vp67"] Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.407675 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2lcqq" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.409600 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 06:48:58 crc kubenswrapper[4786]: W1002 06:48:58.422045 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f3cfa4b_61d4_4b77_9dee_0a0595130ee8.slice/crio-40da3ee535f97303d7e9777047c8b05f724b12c49070840895767f6ae02757b1 WatchSource:0}: Error finding container 40da3ee535f97303d7e9777047c8b05f724b12c49070840895767f6ae02757b1: Status 404 returned error can't find the container with id 40da3ee535f97303d7e9777047c8b05f724b12c49070840895767f6ae02757b1 Oct 02 06:48:58 crc kubenswrapper[4786]: W1002 06:48:58.435041 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc9dbf55c_ce44_4183_a1ed_b2075bbe49a9.slice/crio-8001c3fbfef3e7aef2859cbb5b9317c8b84b01827e56c02fc725f0593039aa2a WatchSource:0}: Error finding container 8001c3fbfef3e7aef2859cbb5b9317c8b84b01827e56c02fc725f0593039aa2a: Status 404 returned error can't find the container with id 8001c3fbfef3e7aef2859cbb5b9317c8b84b01827e56c02fc725f0593039aa2a Oct 02 06:48:58 crc kubenswrapper[4786]: W1002 06:48:58.480080 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-6f4153a043ee12db7a3a12ac2272b3d392ab50e4ef9165630819ddc9fe66fe50 WatchSource:0}: 
Error finding container 6f4153a043ee12db7a3a12ac2272b3d392ab50e4ef9165630819ddc9fe66fe50: Status 404 returned error can't find the container with id 6f4153a043ee12db7a3a12ac2272b3d392ab50e4ef9165630819ddc9fe66fe50 Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.623813 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:58 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:58 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:58 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.623864 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.688016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f4880b62-d5a5-402a-b285-f6c0ca7e3f45","Type":"ContainerDied","Data":"4030f061f3cade302fa1c28ef28ac4a9401ef61b921e32f4c626105d75d73d5e"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.688213 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4030f061f3cade302fa1c28ef28ac4a9401ef61b921e32f4c626105d75d73d5e" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.688069 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.691390 4786 generic.go:334] "Generic (PLEG): container finished" podID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerID="e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567" exitCode=0 Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.691499 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vp67" event={"ID":"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8","Type":"ContainerDied","Data":"e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.691558 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vp67" event={"ID":"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8","Type":"ContainerStarted","Data":"40da3ee535f97303d7e9777047c8b05f724b12c49070840895767f6ae02757b1"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.693485 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9","Type":"ContainerStarted","Data":"8001c3fbfef3e7aef2859cbb5b9317c8b84b01827e56c02fc725f0593039aa2a"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.700833 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"701c14672dd64db9c79eb6f5200aa8a06ee87af870238394363ff0b54d615344"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.700873 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6f4153a043ee12db7a3a12ac2272b3d392ab50e4ef9165630819ddc9fe66fe50"} Oct 
02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.702301 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9e7c27d84a8eedf991165d793b00326b4364f04a39734d9aa6fcf2491386c7be"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.702327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cd5025eec8f0ea1667e256c8f45a86ee30ac3a1ae97c5817cabc6a6a148add84"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.704450 4786 generic.go:334] "Generic (PLEG): container finished" podID="68893526-5b68-42b3-8711-a11fed5996e7" containerID="3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4" exitCode=0 Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.704877 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fskng" event={"ID":"68893526-5b68-42b3-8711-a11fed5996e7","Type":"ContainerDied","Data":"3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.706443 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0941c6a08f61183581e8114e78d2652d5a1a0327c58f81dbef55334bfd272d01"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.706492 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0cc5bfb42150b3b8cdba4d286be37247060b9a1ae9f979f5f15f353c28ef1c99"} Oct 02 06:48:58 crc kubenswrapper[4786]: I1002 06:48:58.707276 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:48:59 crc kubenswrapper[4786]: I1002 06:48:59.622907 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:48:59 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:48:59 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:48:59 crc kubenswrapper[4786]: healthz check failed Oct 02 06:48:59 crc kubenswrapper[4786]: I1002 06:48:59.623135 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:48:59 crc kubenswrapper[4786]: I1002 06:48:59.716778 4786 generic.go:334] "Generic (PLEG): container finished" podID="c9dbf55c-ce44-4183-a1ed-b2075bbe49a9" containerID="a226f1eaf9049e7d502b064f9172bf7a32f371246db294c2606b051f6366f76f" exitCode=0 Oct 02 06:48:59 crc kubenswrapper[4786]: I1002 06:48:59.716901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9","Type":"ContainerDied","Data":"a226f1eaf9049e7d502b064f9172bf7a32f371246db294c2606b051f6366f76f"} Oct 02 06:49:00 crc kubenswrapper[4786]: I1002 06:49:00.622277 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:49:00 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:49:00 crc kubenswrapper[4786]: [+]process-running ok Oct 02 
06:49:00 crc kubenswrapper[4786]: healthz check failed Oct 02 06:49:00 crc kubenswrapper[4786]: I1002 06:49:00.622322 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:49:01 crc kubenswrapper[4786]: I1002 06:49:01.622713 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:49:01 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:49:01 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:49:01 crc kubenswrapper[4786]: healthz check failed Oct 02 06:49:01 crc kubenswrapper[4786]: I1002 06:49:01.622917 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:49:01 crc kubenswrapper[4786]: I1002 06:49:01.799324 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:49:01 crc kubenswrapper[4786]: I1002 06:49:01.925312 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kubelet-dir\") pod \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\" (UID: \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\") " Oct 02 06:49:01 crc kubenswrapper[4786]: I1002 06:49:01.925369 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kube-api-access\") pod \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\" (UID: \"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9\") " Oct 02 06:49:01 crc kubenswrapper[4786]: I1002 06:49:01.925391 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c9dbf55c-ce44-4183-a1ed-b2075bbe49a9" (UID: "c9dbf55c-ce44-4183-a1ed-b2075bbe49a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:49:01 crc kubenswrapper[4786]: I1002 06:49:01.925755 4786 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:01 crc kubenswrapper[4786]: I1002 06:49:01.932845 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c9dbf55c-ce44-4183-a1ed-b2075bbe49a9" (UID: "c9dbf55c-ce44-4183-a1ed-b2075bbe49a9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:49:02 crc kubenswrapper[4786]: I1002 06:49:02.027219 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9dbf55c-ce44-4183-a1ed-b2075bbe49a9-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:02 crc kubenswrapper[4786]: I1002 06:49:02.622515 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:49:02 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:49:02 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:49:02 crc kubenswrapper[4786]: healthz check failed Oct 02 06:49:02 crc kubenswrapper[4786]: I1002 06:49:02.622580 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:49:02 crc kubenswrapper[4786]: I1002 06:49:02.742274 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"c9dbf55c-ce44-4183-a1ed-b2075bbe49a9","Type":"ContainerDied","Data":"8001c3fbfef3e7aef2859cbb5b9317c8b84b01827e56c02fc725f0593039aa2a"} Oct 02 06:49:02 crc kubenswrapper[4786]: I1002 06:49:02.742511 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8001c3fbfef3e7aef2859cbb5b9317c8b84b01827e56c02fc725f0593039aa2a" Oct 02 06:49:02 crc kubenswrapper[4786]: I1002 06:49:02.742303 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 06:49:03 crc kubenswrapper[4786]: I1002 06:49:03.622875 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:49:03 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:49:03 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:49:03 crc kubenswrapper[4786]: healthz check failed Oct 02 06:49:03 crc kubenswrapper[4786]: I1002 06:49:03.623090 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:49:03 crc kubenswrapper[4786]: I1002 06:49:03.958250 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:49:04 crc kubenswrapper[4786]: I1002 06:49:04.623143 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:49:04 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:49:04 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:49:04 crc kubenswrapper[4786]: healthz check failed Oct 02 06:49:04 crc kubenswrapper[4786]: I1002 06:49:04.623205 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:49:05 crc kubenswrapper[4786]: I1002 06:49:05.622405 4786 
patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:49:05 crc kubenswrapper[4786]: [-]has-synced failed: reason withheld Oct 02 06:49:05 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:49:05 crc kubenswrapper[4786]: healthz check failed Oct 02 06:49:05 crc kubenswrapper[4786]: I1002 06:49:05.622633 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:49:06 crc kubenswrapper[4786]: I1002 06:49:06.259975 4786 patch_prober.go:28] interesting pod/console-f9d7485db-ph6hx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Oct 02 06:49:06 crc kubenswrapper[4786]: I1002 06:49:06.260014 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ph6hx" podUID="ca2df496-2124-4881-ae15-4fa5a1a4f0ea" containerName="console" probeResult="failure" output="Get \"https://10.217.0.32:8443/health\": dial tcp 10.217.0.32:8443: connect: connection refused" Oct 02 06:49:06 crc kubenswrapper[4786]: I1002 06:49:06.623036 4786 patch_prober.go:28] interesting pod/router-default-5444994796-fxqfj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 06:49:06 crc kubenswrapper[4786]: [+]has-synced ok Oct 02 06:49:06 crc kubenswrapper[4786]: [+]process-running ok Oct 02 06:49:06 crc kubenswrapper[4786]: healthz check failed Oct 02 06:49:06 crc kubenswrapper[4786]: I1002 
06:49:06.623085 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fxqfj" podUID="04d3ac96-87c1-4034-abf4-7d08d218f135" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 06:49:07 crc kubenswrapper[4786]: I1002 06:49:07.622953 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:49:07 crc kubenswrapper[4786]: I1002 06:49:07.624862 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-fxqfj" Oct 02 06:49:08 crc kubenswrapper[4786]: I1002 06:49:08.915516 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:49:08 crc kubenswrapper[4786]: I1002 06:49:08.921839 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e4217c0-9581-4727-b594-adb99293f7db-metrics-certs\") pod \"network-metrics-daemon-p8zkp\" (UID: \"6e4217c0-9581-4727-b594-adb99293f7db\") " pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:49:09 crc kubenswrapper[4786]: I1002 06:49:09.209504 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p8zkp" Oct 02 06:49:11 crc kubenswrapper[4786]: I1002 06:49:11.606242 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.806135 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nj4j" event={"ID":"1638ccfb-4503-4402-bec5-e1f8a9588c38","Type":"ContainerStarted","Data":"c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368"} Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.808461 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vp67" event={"ID":"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8","Type":"ContainerStarted","Data":"a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25"} Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.809849 4786 generic.go:334] "Generic (PLEG): container finished" podID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerID="06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319" exitCode=0 Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.809908 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9mns" event={"ID":"074fa372-0ddd-47ea-a3ad-9203d7574875","Type":"ContainerDied","Data":"06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319"} Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.811189 4786 generic.go:334] "Generic (PLEG): container finished" podID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerID="a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba" exitCode=0 Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.811233 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbkbh" 
event={"ID":"dfdbf275-781b-4a7d-b943-b592d682d11a","Type":"ContainerDied","Data":"a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba"} Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.813629 4786 generic.go:334] "Generic (PLEG): container finished" podID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerID="9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de" exitCode=0 Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.813678 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghfbw" event={"ID":"787bb954-7ee0-4a9d-ba2b-9f8352adba43","Type":"ContainerDied","Data":"9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de"} Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.815813 4786 generic.go:334] "Generic (PLEG): container finished" podID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerID="555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2" exitCode=0 Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.815855 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfgkp" event={"ID":"1e96c39a-5626-40ce-ad0f-f455c3292478","Type":"ContainerDied","Data":"555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2"} Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.818236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fskng" event={"ID":"68893526-5b68-42b3-8711-a11fed5996e7","Type":"ContainerStarted","Data":"49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c"} Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 06:49:12.821608 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkn49" event={"ID":"0929cab2-9dd2-42c2-a91f-1b98bf72ace3","Type":"ContainerStarted","Data":"679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5"} Oct 02 06:49:12 crc kubenswrapper[4786]: I1002 
06:49:12.840704 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p8zkp"] Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.831468 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbkbh" event={"ID":"dfdbf275-781b-4a7d-b943-b592d682d11a","Type":"ContainerStarted","Data":"63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.834749 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghfbw" event={"ID":"787bb954-7ee0-4a9d-ba2b-9f8352adba43","Type":"ContainerStarted","Data":"cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.837291 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfgkp" event={"ID":"1e96c39a-5626-40ce-ad0f-f455c3292478","Type":"ContainerStarted","Data":"df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.838766 4786 generic.go:334] "Generic (PLEG): container finished" podID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerID="679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5" exitCode=0 Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.838813 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkn49" event={"ID":"0929cab2-9dd2-42c2-a91f-1b98bf72ace3","Type":"ContainerDied","Data":"679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.841213 4786 generic.go:334] "Generic (PLEG): container finished" podID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerID="c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368" exitCode=0 Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.841261 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nj4j" event={"ID":"1638ccfb-4503-4402-bec5-e1f8a9588c38","Type":"ContainerDied","Data":"c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.850075 4786 generic.go:334] "Generic (PLEG): container finished" podID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerID="a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25" exitCode=0 Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.850245 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vp67" event={"ID":"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8","Type":"ContainerDied","Data":"a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.852540 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9mns" event={"ID":"074fa372-0ddd-47ea-a3ad-9203d7574875","Type":"ContainerStarted","Data":"904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.855286 4786 generic.go:334] "Generic (PLEG): container finished" podID="68893526-5b68-42b3-8711-a11fed5996e7" containerID="49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c" exitCode=0 Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.855334 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fskng" event={"ID":"68893526-5b68-42b3-8711-a11fed5996e7","Type":"ContainerDied","Data":"49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.855377 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tbkbh" podStartSLOduration=2.171966204 podStartE2EDuration="20.855367759s" 
podCreationTimestamp="2025-10-02 06:48:53 +0000 UTC" firstStartedPulling="2025-10-02 06:48:54.609814044 +0000 UTC m=+144.730997175" lastFinishedPulling="2025-10-02 06:49:13.293215599 +0000 UTC m=+163.414398730" observedRunningTime="2025-10-02 06:49:13.852223204 +0000 UTC m=+163.973406345" watchObservedRunningTime="2025-10-02 06:49:13.855367759 +0000 UTC m=+163.976550890" Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.857147 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" event={"ID":"6e4217c0-9581-4727-b594-adb99293f7db","Type":"ContainerStarted","Data":"781c1294c079ae7810b7dcf14cb8605caa0e9fbccf7ba5b6c5bb437f4a3eb6a4"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.857170 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" event={"ID":"6e4217c0-9581-4727-b594-adb99293f7db","Type":"ContainerStarted","Data":"853e33ef6c0e81765eae619c6a708d6a408b149b1e82ab38d14e7bfda3580b55"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.857179 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p8zkp" event={"ID":"6e4217c0-9581-4727-b594-adb99293f7db","Type":"ContainerStarted","Data":"5c0e40050c5d408845ea871d119d42789e1bcc59c864f8d5e986762f50799cd3"} Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.897524 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rfgkp" podStartSLOduration=2.087916776 podStartE2EDuration="22.897510145s" podCreationTimestamp="2025-10-02 06:48:51 +0000 UTC" firstStartedPulling="2025-10-02 06:48:52.566858744 +0000 UTC m=+142.688041875" lastFinishedPulling="2025-10-02 06:49:13.376452113 +0000 UTC m=+163.497635244" observedRunningTime="2025-10-02 06:49:13.896209659 +0000 UTC m=+164.017392800" watchObservedRunningTime="2025-10-02 06:49:13.897510145 +0000 UTC m=+164.018693276" Oct 02 06:49:13 crc 
kubenswrapper[4786]: I1002 06:49:13.929815 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ghfbw" podStartSLOduration=4.431476618 podStartE2EDuration="19.929799935s" podCreationTimestamp="2025-10-02 06:48:54 +0000 UTC" firstStartedPulling="2025-10-02 06:48:57.916188069 +0000 UTC m=+148.037371200" lastFinishedPulling="2025-10-02 06:49:13.414511386 +0000 UTC m=+163.535694517" observedRunningTime="2025-10-02 06:49:13.928240788 +0000 UTC m=+164.049423949" watchObservedRunningTime="2025-10-02 06:49:13.929799935 +0000 UTC m=+164.050983067" Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.955604 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-p8zkp" podStartSLOduration=146.955589558 podStartE2EDuration="2m26.955589558s" podCreationTimestamp="2025-10-02 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:49:13.940345937 +0000 UTC m=+164.061529068" watchObservedRunningTime="2025-10-02 06:49:13.955589558 +0000 UTC m=+164.076772689" Oct 02 06:49:13 crc kubenswrapper[4786]: I1002 06:49:13.967105 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c9mns" podStartSLOduration=2.206467297 podStartE2EDuration="22.967092164s" podCreationTimestamp="2025-10-02 06:48:51 +0000 UTC" firstStartedPulling="2025-10-02 06:48:52.561999057 +0000 UTC m=+142.683182189" lastFinishedPulling="2025-10-02 06:49:13.322623924 +0000 UTC m=+163.443807056" observedRunningTime="2025-10-02 06:49:13.96674227 +0000 UTC m=+164.087925411" watchObservedRunningTime="2025-10-02 06:49:13.967092164 +0000 UTC m=+164.088275285" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.066744 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.066981 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.466574 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.466785 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.512804 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.864089 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkn49" event={"ID":"0929cab2-9dd2-42c2-a91f-1b98bf72ace3","Type":"ContainerStarted","Data":"4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9"} Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.866348 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nj4j" event={"ID":"1638ccfb-4503-4402-bec5-e1f8a9588c38","Type":"ContainerStarted","Data":"30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549"} Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.868092 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vp67" event={"ID":"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8","Type":"ContainerStarted","Data":"9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f"} Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.870099 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fskng" 
event={"ID":"68893526-5b68-42b3-8711-a11fed5996e7","Type":"ContainerStarted","Data":"ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369"} Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.879199 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fkn49" podStartSLOduration=2.089165123 podStartE2EDuration="23.879184876s" podCreationTimestamp="2025-10-02 06:48:51 +0000 UTC" firstStartedPulling="2025-10-02 06:48:52.569912686 +0000 UTC m=+142.691095817" lastFinishedPulling="2025-10-02 06:49:14.359932438 +0000 UTC m=+164.481115570" observedRunningTime="2025-10-02 06:49:14.878662435 +0000 UTC m=+164.999845576" watchObservedRunningTime="2025-10-02 06:49:14.879184876 +0000 UTC m=+165.000368007" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.896618 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6nj4j" podStartSLOduration=2.133381538 podStartE2EDuration="22.896603321s" podCreationTimestamp="2025-10-02 06:48:52 +0000 UTC" firstStartedPulling="2025-10-02 06:48:53.59515706 +0000 UTC m=+143.716340191" lastFinishedPulling="2025-10-02 06:49:14.358378842 +0000 UTC m=+164.479561974" observedRunningTime="2025-10-02 06:49:14.893793482 +0000 UTC m=+165.014976614" watchObservedRunningTime="2025-10-02 06:49:14.896603321 +0000 UTC m=+165.017786452" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.906061 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fskng" podStartSLOduration=5.110799262 podStartE2EDuration="20.906047833s" podCreationTimestamp="2025-10-02 06:48:54 +0000 UTC" firstStartedPulling="2025-10-02 06:48:58.705607831 +0000 UTC m=+148.826790963" lastFinishedPulling="2025-10-02 06:49:14.500856403 +0000 UTC m=+164.622039534" observedRunningTime="2025-10-02 06:49:14.905072564 +0000 UTC m=+165.026255705" watchObservedRunningTime="2025-10-02 
06:49:14.906047833 +0000 UTC m=+165.027230964" Oct 02 06:49:14 crc kubenswrapper[4786]: I1002 06:49:14.919217 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4vp67" podStartSLOduration=4.274321205 podStartE2EDuration="19.919202793s" podCreationTimestamp="2025-10-02 06:48:55 +0000 UTC" firstStartedPulling="2025-10-02 06:48:58.692618686 +0000 UTC m=+148.813801817" lastFinishedPulling="2025-10-02 06:49:14.337500275 +0000 UTC m=+164.458683405" observedRunningTime="2025-10-02 06:49:14.916800177 +0000 UTC m=+165.037983318" watchObservedRunningTime="2025-10-02 06:49:14.919202793 +0000 UTC m=+165.040385925" Oct 02 06:49:15 crc kubenswrapper[4786]: I1002 06:49:15.094418 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:49:15 crc kubenswrapper[4786]: I1002 06:49:15.094460 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:49:15 crc kubenswrapper[4786]: I1002 06:49:15.132395 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-tbkbh" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerName="registry-server" probeResult="failure" output=< Oct 02 06:49:15 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Oct 02 06:49:15 crc kubenswrapper[4786]: > Oct 02 06:49:15 crc kubenswrapper[4786]: I1002 06:49:15.478932 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:49:15 crc kubenswrapper[4786]: I1002 06:49:15.479206 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:49:16 crc kubenswrapper[4786]: I1002 06:49:16.120414 4786 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-fskng" podUID="68893526-5b68-42b3-8711-a11fed5996e7" containerName="registry-server" probeResult="failure" output=< Oct 02 06:49:16 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Oct 02 06:49:16 crc kubenswrapper[4786]: > Oct 02 06:49:16 crc kubenswrapper[4786]: I1002 06:49:16.266721 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:49:16 crc kubenswrapper[4786]: I1002 06:49:16.271037 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:49:16 crc kubenswrapper[4786]: I1002 06:49:16.504178 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4vp67" podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="registry-server" probeResult="failure" output=< Oct 02 06:49:16 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Oct 02 06:49:16 crc kubenswrapper[4786]: > Oct 02 06:49:21 crc kubenswrapper[4786]: I1002 06:49:21.864492 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:49:21 crc kubenswrapper[4786]: I1002 06:49:21.864961 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:49:21 crc kubenswrapper[4786]: I1002 06:49:21.895112 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:49:21 crc kubenswrapper[4786]: I1002 06:49:21.927029 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.064029 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.064071 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.090310 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.267841 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.267913 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.297906 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.522891 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.523175 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.549683 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.930925 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 06:49:22.930977 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:49:22 crc kubenswrapper[4786]: I1002 
06:49:22.935528 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:49:23 crc kubenswrapper[4786]: I1002 06:49:23.844665 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rfgkp"] Oct 02 06:49:23 crc kubenswrapper[4786]: I1002 06:49:23.962667 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-txfb6"] Oct 02 06:49:24 crc kubenswrapper[4786]: I1002 06:49:24.108665 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:49:24 crc kubenswrapper[4786]: I1002 06:49:24.161256 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:49:24 crc kubenswrapper[4786]: I1002 06:49:24.439743 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6nj4j"] Oct 02 06:49:24 crc kubenswrapper[4786]: I1002 06:49:24.493107 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:49:24 crc kubenswrapper[4786]: I1002 06:49:24.920591 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rfgkp" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerName="registry-server" containerID="cri-o://df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a" gracePeriod=2 Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.125746 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.163100 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fskng" Oct 02 
06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.228274 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.398570 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-utilities\") pod \"1e96c39a-5626-40ce-ad0f-f455c3292478\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.398713 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-catalog-content\") pod \"1e96c39a-5626-40ce-ad0f-f455c3292478\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.398761 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g468t\" (UniqueName: \"kubernetes.io/projected/1e96c39a-5626-40ce-ad0f-f455c3292478-kube-api-access-g468t\") pod \"1e96c39a-5626-40ce-ad0f-f455c3292478\" (UID: \"1e96c39a-5626-40ce-ad0f-f455c3292478\") " Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.399271 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-utilities" (OuterVolumeSpecName: "utilities") pod "1e96c39a-5626-40ce-ad0f-f455c3292478" (UID: "1e96c39a-5626-40ce-ad0f-f455c3292478"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.415464 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e96c39a-5626-40ce-ad0f-f455c3292478-kube-api-access-g468t" (OuterVolumeSpecName: "kube-api-access-g468t") pod "1e96c39a-5626-40ce-ad0f-f455c3292478" (UID: "1e96c39a-5626-40ce-ad0f-f455c3292478"). InnerVolumeSpecName "kube-api-access-g468t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.434100 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e96c39a-5626-40ce-ad0f-f455c3292478" (UID: "1e96c39a-5626-40ce-ad0f-f455c3292478"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.500258 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g468t\" (UniqueName: \"kubernetes.io/projected/1e96c39a-5626-40ce-ad0f-f455c3292478-kube-api-access-g468t\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.500288 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.500298 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e96c39a-5626-40ce-ad0f-f455c3292478-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.506211 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:49:25 crc 
kubenswrapper[4786]: I1002 06:49:25.531570 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.926663 4786 generic.go:334] "Generic (PLEG): container finished" podID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerID="df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a" exitCode=0 Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.927160 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rfgkp" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.933184 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfgkp" event={"ID":"1e96c39a-5626-40ce-ad0f-f455c3292478","Type":"ContainerDied","Data":"df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a"} Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.933236 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rfgkp" event={"ID":"1e96c39a-5626-40ce-ad0f-f455c3292478","Type":"ContainerDied","Data":"391b25f705b004c1900538ca914962f65bb0b1d21385ad701b6b15e550b51727"} Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.933259 4786 scope.go:117] "RemoveContainer" containerID="df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.933613 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6nj4j" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerName="registry-server" containerID="cri-o://30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549" gracePeriod=2 Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.948274 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rfgkp"] Oct 02 
06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.950172 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rfgkp"] Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.952344 4786 scope.go:117] "RemoveContainer" containerID="555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2" Oct 02 06:49:25 crc kubenswrapper[4786]: I1002 06:49:25.965549 4786 scope.go:117] "RemoveContainer" containerID="c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.049395 4786 scope.go:117] "RemoveContainer" containerID="df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a" Oct 02 06:49:26 crc kubenswrapper[4786]: E1002 06:49:26.049783 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a\": container with ID starting with df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a not found: ID does not exist" containerID="df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.049818 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a"} err="failed to get container status \"df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a\": rpc error: code = NotFound desc = could not find container \"df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a\": container with ID starting with df315a89e6f3c18ba01c3a535ac0edd84e8f4acf160b67b2fcd346dac23da72a not found: ID does not exist" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.049852 4786 scope.go:117] "RemoveContainer" containerID="555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2" Oct 02 06:49:26 crc kubenswrapper[4786]: E1002 
06:49:26.050068 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2\": container with ID starting with 555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2 not found: ID does not exist" containerID="555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.050089 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2"} err="failed to get container status \"555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2\": rpc error: code = NotFound desc = could not find container \"555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2\": container with ID starting with 555e6b6a02e283e17d785044506ec3828d2625c5569b27d0e630b89dc6e3b4b2 not found: ID does not exist" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.050101 4786 scope.go:117] "RemoveContainer" containerID="c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2" Oct 02 06:49:26 crc kubenswrapper[4786]: E1002 06:49:26.050296 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2\": container with ID starting with c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2 not found: ID does not exist" containerID="c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.050316 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2"} err="failed to get container status \"c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2\": rpc 
error: code = NotFound desc = could not find container \"c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2\": container with ID starting with c0a47247e51ecec08678e63ce31aaf75c0f3439365da3ac9de4a6dc8fa0712b2 not found: ID does not exist" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.198826 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" path="/var/lib/kubelet/pods/1e96c39a-5626-40ce-ad0f-f455c3292478/volumes" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.212475 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.308184 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-catalog-content\") pod \"1638ccfb-4503-4402-bec5-e1f8a9588c38\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.308245 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-utilities\") pod \"1638ccfb-4503-4402-bec5-e1f8a9588c38\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.308843 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-utilities" (OuterVolumeSpecName: "utilities") pod "1638ccfb-4503-4402-bec5-e1f8a9588c38" (UID: "1638ccfb-4503-4402-bec5-e1f8a9588c38"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.339983 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1638ccfb-4503-4402-bec5-e1f8a9588c38" (UID: "1638ccfb-4503-4402-bec5-e1f8a9588c38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.408581 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5zrc\" (UniqueName: \"kubernetes.io/projected/1638ccfb-4503-4402-bec5-e1f8a9588c38-kube-api-access-l5zrc\") pod \"1638ccfb-4503-4402-bec5-e1f8a9588c38\" (UID: \"1638ccfb-4503-4402-bec5-e1f8a9588c38\") " Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.408811 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.408828 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1638ccfb-4503-4402-bec5-e1f8a9588c38-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.422678 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1638ccfb-4503-4402-bec5-e1f8a9588c38-kube-api-access-l5zrc" (OuterVolumeSpecName: "kube-api-access-l5zrc") pod "1638ccfb-4503-4402-bec5-e1f8a9588c38" (UID: "1638ccfb-4503-4402-bec5-e1f8a9588c38"). InnerVolumeSpecName "kube-api-access-l5zrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.509758 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5zrc\" (UniqueName: \"kubernetes.io/projected/1638ccfb-4503-4402-bec5-e1f8a9588c38-kube-api-access-l5zrc\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.638167 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-dnz65" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.837230 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghfbw"] Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.837479 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ghfbw" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerName="registry-server" containerID="cri-o://cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615" gracePeriod=2 Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.933544 4786 generic.go:334] "Generic (PLEG): container finished" podID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerID="30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549" exitCode=0 Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.933643 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6nj4j" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.937243 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nj4j" event={"ID":"1638ccfb-4503-4402-bec5-e1f8a9588c38","Type":"ContainerDied","Data":"30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549"} Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.937294 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nj4j" event={"ID":"1638ccfb-4503-4402-bec5-e1f8a9588c38","Type":"ContainerDied","Data":"06f08de8521fe8102e9beaf6145a970c441601e3fd7afc37dd8565e006ca8143"} Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.937319 4786 scope.go:117] "RemoveContainer" containerID="30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.953764 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6nj4j"] Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.955584 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6nj4j"] Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.955640 4786 scope.go:117] "RemoveContainer" containerID="c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.977858 4786 scope.go:117] "RemoveContainer" containerID="084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.989870 4786 scope.go:117] "RemoveContainer" containerID="30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549" Oct 02 06:49:26 crc kubenswrapper[4786]: E1002 06:49:26.991316 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549\": container with ID starting with 30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549 not found: ID does not exist" containerID="30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.991343 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549"} err="failed to get container status \"30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549\": rpc error: code = NotFound desc = could not find container \"30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549\": container with ID starting with 30b88e747f012f732fc6078d8ef0a63a2874d99ac7b632650245fbcf458a6549 not found: ID does not exist" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.991362 4786 scope.go:117] "RemoveContainer" containerID="c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368" Oct 02 06:49:26 crc kubenswrapper[4786]: E1002 06:49:26.991620 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368\": container with ID starting with c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368 not found: ID does not exist" containerID="c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.991654 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368"} err="failed to get container status \"c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368\": rpc error: code = NotFound desc = could not find container \"c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368\": container with ID 
starting with c6ff6758968fa0fb8db35fdd2a0711782d290f08f2c9067aa688616546b4f368 not found: ID does not exist" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.991675 4786 scope.go:117] "RemoveContainer" containerID="084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa" Oct 02 06:49:26 crc kubenswrapper[4786]: E1002 06:49:26.991905 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa\": container with ID starting with 084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa not found: ID does not exist" containerID="084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa" Oct 02 06:49:26 crc kubenswrapper[4786]: I1002 06:49:26.991930 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa"} err="failed to get container status \"084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa\": rpc error: code = NotFound desc = could not find container \"084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa\": container with ID starting with 084cc52cb555b4068d7495460f6a7366cf16f87c53f95c8820a058891b28daaa not found: ID does not exist" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.137081 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.317223 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-catalog-content\") pod \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.317290 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-utilities\") pod \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.317404 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxn5p\" (UniqueName: \"kubernetes.io/projected/787bb954-7ee0-4a9d-ba2b-9f8352adba43-kube-api-access-fxn5p\") pod \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\" (UID: \"787bb954-7ee0-4a9d-ba2b-9f8352adba43\") " Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.318051 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-utilities" (OuterVolumeSpecName: "utilities") pod "787bb954-7ee0-4a9d-ba2b-9f8352adba43" (UID: "787bb954-7ee0-4a9d-ba2b-9f8352adba43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.326800 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "787bb954-7ee0-4a9d-ba2b-9f8352adba43" (UID: "787bb954-7ee0-4a9d-ba2b-9f8352adba43"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.331539 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787bb954-7ee0-4a9d-ba2b-9f8352adba43-kube-api-access-fxn5p" (OuterVolumeSpecName: "kube-api-access-fxn5p") pod "787bb954-7ee0-4a9d-ba2b-9f8352adba43" (UID: "787bb954-7ee0-4a9d-ba2b-9f8352adba43"). InnerVolumeSpecName "kube-api-access-fxn5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.418862 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxn5p\" (UniqueName: \"kubernetes.io/projected/787bb954-7ee0-4a9d-ba2b-9f8352adba43-kube-api-access-fxn5p\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.418890 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.418899 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/787bb954-7ee0-4a9d-ba2b-9f8352adba43-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.497813 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.497877 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.945738 4786 generic.go:334] "Generic (PLEG): container finished" podID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerID="cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615" exitCode=0 Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.945795 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghfbw" event={"ID":"787bb954-7ee0-4a9d-ba2b-9f8352adba43","Type":"ContainerDied","Data":"cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615"} Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.945813 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ghfbw" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.945830 4786 scope.go:117] "RemoveContainer" containerID="cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.945820 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ghfbw" event={"ID":"787bb954-7ee0-4a9d-ba2b-9f8352adba43","Type":"ContainerDied","Data":"7bd883e0aaacc83b7aa091210fc8c7a1d965af9430f485e809f826b4d14151c9"} Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.959875 4786 scope.go:117] "RemoveContainer" containerID="9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de" Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.966546 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghfbw"] Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.968739 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ghfbw"] Oct 02 06:49:27 crc kubenswrapper[4786]: I1002 06:49:27.988630 4786 scope.go:117] "RemoveContainer" 
containerID="4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c" Oct 02 06:49:28 crc kubenswrapper[4786]: I1002 06:49:28.010711 4786 scope.go:117] "RemoveContainer" containerID="cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615" Oct 02 06:49:28 crc kubenswrapper[4786]: E1002 06:49:28.011029 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615\": container with ID starting with cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615 not found: ID does not exist" containerID="cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615" Oct 02 06:49:28 crc kubenswrapper[4786]: I1002 06:49:28.011054 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615"} err="failed to get container status \"cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615\": rpc error: code = NotFound desc = could not find container \"cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615\": container with ID starting with cc783b0b470b99ccad4e8cd7c2f97c991ad08291cd9bc095b974a65655cae615 not found: ID does not exist" Oct 02 06:49:28 crc kubenswrapper[4786]: I1002 06:49:28.011074 4786 scope.go:117] "RemoveContainer" containerID="9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de" Oct 02 06:49:28 crc kubenswrapper[4786]: E1002 06:49:28.011355 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de\": container with ID starting with 9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de not found: ID does not exist" containerID="9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de" Oct 02 06:49:28 crc 
kubenswrapper[4786]: I1002 06:49:28.011384 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de"} err="failed to get container status \"9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de\": rpc error: code = NotFound desc = could not find container \"9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de\": container with ID starting with 9a686115ec75de794c253d8339f16a7b270b84607059249dc6a3c884817f52de not found: ID does not exist" Oct 02 06:49:28 crc kubenswrapper[4786]: I1002 06:49:28.011405 4786 scope.go:117] "RemoveContainer" containerID="4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c" Oct 02 06:49:28 crc kubenswrapper[4786]: E1002 06:49:28.011710 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c\": container with ID starting with 4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c not found: ID does not exist" containerID="4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c" Oct 02 06:49:28 crc kubenswrapper[4786]: I1002 06:49:28.011729 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c"} err="failed to get container status \"4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c\": rpc error: code = NotFound desc = could not find container \"4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c\": container with ID starting with 4c0ba30739ff5deb2bbb5ac195f669cc3a48ae09f5fba665df190e3f392b6d3c not found: ID does not exist" Oct 02 06:49:28 crc kubenswrapper[4786]: I1002 06:49:28.192330 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" 
path="/var/lib/kubelet/pods/1638ccfb-4503-4402-bec5-e1f8a9588c38/volumes" Oct 02 06:49:28 crc kubenswrapper[4786]: I1002 06:49:28.192974 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" path="/var/lib/kubelet/pods/787bb954-7ee0-4a9d-ba2b-9f8352adba43/volumes" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.237310 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4vp67"] Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.237519 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4vp67" podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="registry-server" containerID="cri-o://9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f" gracePeriod=2 Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.533995 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.649102 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-catalog-content\") pod \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.649149 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h57qp\" (UniqueName: \"kubernetes.io/projected/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-kube-api-access-h57qp\") pod \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.649207 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-utilities\") pod \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\" (UID: \"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8\") " Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.649916 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-utilities" (OuterVolumeSpecName: "utilities") pod "5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" (UID: "5f3cfa4b-61d4-4b77-9dee-0a0595130ee8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.653057 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-kube-api-access-h57qp" (OuterVolumeSpecName: "kube-api-access-h57qp") pod "5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" (UID: "5f3cfa4b-61d4-4b77-9dee-0a0595130ee8"). InnerVolumeSpecName "kube-api-access-h57qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.713719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" (UID: "5f3cfa4b-61d4-4b77-9dee-0a0595130ee8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.750449 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.750487 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.750498 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h57qp\" (UniqueName: \"kubernetes.io/projected/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8-kube-api-access-h57qp\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.959128 4786 generic.go:334] "Generic (PLEG): container finished" podID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerID="9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f" exitCode=0 Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.959170 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vp67" event={"ID":"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8","Type":"ContainerDied","Data":"9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f"} Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.959180 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4vp67" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.959201 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4vp67" event={"ID":"5f3cfa4b-61d4-4b77-9dee-0a0595130ee8","Type":"ContainerDied","Data":"40da3ee535f97303d7e9777047c8b05f724b12c49070840895767f6ae02757b1"} Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.959220 4786 scope.go:117] "RemoveContainer" containerID="9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.973239 4786 scope.go:117] "RemoveContainer" containerID="a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25" Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.982670 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4vp67"] Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.987150 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4vp67"] Oct 02 06:49:29 crc kubenswrapper[4786]: I1002 06:49:29.996755 4786 scope.go:117] "RemoveContainer" containerID="e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567" Oct 02 06:49:30 crc kubenswrapper[4786]: I1002 06:49:30.006533 4786 scope.go:117] "RemoveContainer" containerID="9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f" Oct 02 06:49:30 crc kubenswrapper[4786]: E1002 06:49:30.006858 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f\": container with ID starting with 9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f not found: ID does not exist" containerID="9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f" Oct 02 06:49:30 crc kubenswrapper[4786]: I1002 06:49:30.006884 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f"} err="failed to get container status \"9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f\": rpc error: code = NotFound desc = could not find container \"9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f\": container with ID starting with 9eb478c7f5003a65a4128efe57f8ae42f63c0b7d8747b464cd0b015d0e5f0d7f not found: ID does not exist" Oct 02 06:49:30 crc kubenswrapper[4786]: I1002 06:49:30.006901 4786 scope.go:117] "RemoveContainer" containerID="a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25" Oct 02 06:49:30 crc kubenswrapper[4786]: E1002 06:49:30.007165 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25\": container with ID starting with a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25 not found: ID does not exist" containerID="a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25" Oct 02 06:49:30 crc kubenswrapper[4786]: I1002 06:49:30.007186 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25"} err="failed to get container status \"a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25\": rpc error: code = NotFound desc = could not find container \"a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25\": container with ID starting with a50321949b315a02e03ea1f223887e181c5df077a21668c2fb0ba383d897ad25 not found: ID does not exist" Oct 02 06:49:30 crc kubenswrapper[4786]: I1002 06:49:30.007198 4786 scope.go:117] "RemoveContainer" containerID="e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567" Oct 02 06:49:30 crc kubenswrapper[4786]: E1002 
06:49:30.007568 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567\": container with ID starting with e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567 not found: ID does not exist" containerID="e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567" Oct 02 06:49:30 crc kubenswrapper[4786]: I1002 06:49:30.007590 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567"} err="failed to get container status \"e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567\": rpc error: code = NotFound desc = could not find container \"e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567\": container with ID starting with e47e4da2ae755ab86c1ea7479876460f2cca159eddbcae7cd61b25c576604567 not found: ID does not exist" Oct 02 06:49:30 crc kubenswrapper[4786]: I1002 06:49:30.187125 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" path="/var/lib/kubelet/pods/5f3cfa4b-61d4-4b77-9dee-0a0595130ee8/volumes" Oct 02 06:49:36 crc kubenswrapper[4786]: I1002 06:49:36.308857 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 06:49:48 crc kubenswrapper[4786]: I1002 06:49:48.982291 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" podUID="449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" containerName="oauth-openshift" containerID="cri-o://6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b" gracePeriod=15 Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.252849 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270551 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-666864db7b-zzphm"] Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270740 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dbf55c-ce44-4183-a1ed-b2075bbe49a9" containerName="pruner" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270751 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dbf55c-ce44-4183-a1ed-b2075bbe49a9" containerName="pruner" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270761 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerName="extract-content" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270766 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerName="extract-content" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270776 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270782 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270790 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270795 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270804 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4880b62-d5a5-402a-b285-f6c0ca7e3f45" containerName="pruner" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270810 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4880b62-d5a5-402a-b285-f6c0ca7e3f45" containerName="pruner" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270817 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270822 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270827 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="extract-utilities" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270832 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="extract-utilities" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270838 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerName="extract-utilities" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270844 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerName="extract-utilities" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270850 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerName="extract-utilities" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270855 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerName="extract-utilities" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270863 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="extract-content" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270868 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="extract-content" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270874 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerName="extract-content" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270879 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerName="extract-content" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270885 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" containerName="oauth-openshift" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270890 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" containerName="oauth-openshift" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270899 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerName="extract-content" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270904 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerName="extract-content" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270912 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerName="extract-utilities" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270918 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerName="extract-utilities" Oct 02 06:49:49 crc kubenswrapper[4786]: E1002 06:49:49.270924 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.270929 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.271002 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3cfa4b-61d4-4b77-9dee-0a0595130ee8" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.271009 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638ccfb-4503-4402-bec5-e1f8a9588c38" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.271018 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" containerName="oauth-openshift" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.271026 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e96c39a-5626-40ce-ad0f-f455c3292478" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.271034 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4880b62-d5a5-402a-b285-f6c0ca7e3f45" containerName="pruner" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.271041 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9dbf55c-ce44-4183-a1ed-b2075bbe49a9" containerName="pruner" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.271049 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="787bb954-7ee0-4a9d-ba2b-9f8352adba43" containerName="registry-server" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.271328 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.287426 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-666864db7b-zzphm"] Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432214 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-service-ca\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432253 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-error\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432277 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jfdj\" (UniqueName: \"kubernetes.io/projected/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-kube-api-access-9jfdj\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432306 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-session\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432326 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-policies\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432345 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-login\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432373 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-serving-cert\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432407 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-dir\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432442 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-idp-0-file-data\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432464 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-trusted-ca-bundle\") pod 
\"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432493 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-cliconfig\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432519 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-ocp-branding-template\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432531 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432547 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-provider-selection\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432566 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-router-certs\") pod \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\" (UID: \"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb\") " Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432672 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-session\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432732 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432759 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432778 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1251e4f6-d4c1-4996-ba96-32db14815f1e-audit-dir\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432796 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-router-certs\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432813 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-audit-policies\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432835 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " 
pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432855 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432901 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7dts\" (UniqueName: \"kubernetes.io/projected/1251e4f6-d4c1-4996-ba96-32db14815f1e-kube-api-access-v7dts\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432920 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-service-ca\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432950 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432976 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.432995 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-login\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.433019 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.433315 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-error\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.433365 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.433497 4786 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.433518 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.433530 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.433540 4786 reconciler_common.go:293] "Volume 
detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.433576 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.437826 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.438217 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.438285 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-kube-api-access-9jfdj" (OuterVolumeSpecName: "kube-api-access-9jfdj") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "kube-api-access-9jfdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.438607 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.438882 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.438987 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.439068 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.439190 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.439332 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" (UID: "449bc1d6-8b9d-4318-8bfb-02b2a94a23fb"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534127 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534211 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-login\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534226 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-error\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534261 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-session\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534283 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534302 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534322 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1251e4f6-d4c1-4996-ba96-32db14815f1e-audit-dir\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534337 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-router-certs\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " 
pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534351 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-audit-policies\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534372 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534388 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534422 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7dts\" (UniqueName: \"kubernetes.io/projected/1251e4f6-d4c1-4996-ba96-32db14815f1e-kube-api-access-v7dts\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534437 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-service-ca\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534490 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534500 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jfdj\" (UniqueName: \"kubernetes.io/projected/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-kube-api-access-9jfdj\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534509 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534517 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534526 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534535 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534544 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534554 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534562 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534571 4786 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534761 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1251e4f6-d4c1-4996-ba96-32db14815f1e-audit-dir\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.534919 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.535114 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.535358 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-service-ca\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.535605 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1251e4f6-d4c1-4996-ba96-32db14815f1e-audit-policies\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.537521 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.537579 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.537904 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.537931 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-login\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.537951 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.538147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-user-template-error\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.538252 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-router-certs\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.538427 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1251e4f6-d4c1-4996-ba96-32db14815f1e-v4-0-config-system-session\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.547538 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7dts\" (UniqueName: \"kubernetes.io/projected/1251e4f6-d4c1-4996-ba96-32db14815f1e-kube-api-access-v7dts\") pod \"oauth-openshift-666864db7b-zzphm\" (UID: \"1251e4f6-d4c1-4996-ba96-32db14815f1e\") " pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.581732 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:49 crc kubenswrapper[4786]: I1002 06:49:49.705092 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-666864db7b-zzphm"]
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.029775 4786 generic.go:334] "Generic (PLEG): container finished" podID="449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" containerID="6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b" exitCode=0
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.029817 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" event={"ID":"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb","Type":"ContainerDied","Data":"6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b"}
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.030021 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6" event={"ID":"449bc1d6-8b9d-4318-8bfb-02b2a94a23fb","Type":"ContainerDied","Data":"3a443c42b61aa6f7a0691a4d32cc78eeb3f58f9e45bf08b5dc374e328afcce2b"}
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.029856 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-txfb6"
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.030039 4786 scope.go:117] "RemoveContainer" containerID="6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b"
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.032257 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" event={"ID":"1251e4f6-d4c1-4996-ba96-32db14815f1e","Type":"ContainerStarted","Data":"a2bec31fe5a99a45e8ba189c9f8e71b46fdc6f9769fc4ecb01a76f69c61f3ea5"}
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.032295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" event={"ID":"1251e4f6-d4c1-4996-ba96-32db14815f1e","Type":"ContainerStarted","Data":"13134c6cea44f58dab887619114c7e8bfd27dfcaa32a20682b05d51b0821ba60"}
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.032394 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.044765 4786 scope.go:117] "RemoveContainer" containerID="6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b"
Oct 02 06:49:50 crc kubenswrapper[4786]: E1002 06:49:50.045162 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b\": container with ID starting with 6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b not found: ID does not exist" containerID="6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b"
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.045189 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b"} err="failed to get container status \"6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b\": rpc error: code = NotFound desc = could not find container \"6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b\": container with ID starting with 6d888c4431c1676477f23a4ea172d4126affc4b0b4246afe5ba488fb4d4aa27b not found: ID does not exist"
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.047743 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-666864db7b-zzphm" podStartSLOduration=27.047733353 podStartE2EDuration="27.047733353s" podCreationTimestamp="2025-10-02 06:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:49:50.046113037 +0000 UTC m=+200.167296178" watchObservedRunningTime="2025-10-02 06:49:50.047733353 +0000 UTC m=+200.168916483"
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.058475 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-txfb6"]
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.060591 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-txfb6"]
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.076908 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-666864db7b-zzphm"
Oct 02 06:49:50 crc kubenswrapper[4786]: I1002 06:49:50.183448 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449bc1d6-8b9d-4318-8bfb-02b2a94a23fb" path="/var/lib/kubelet/pods/449bc1d6-8b9d-4318-8bfb-02b2a94a23fb/volumes"
Oct 02 06:49:57 crc kubenswrapper[4786]: I1002 06:49:57.497246 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 06:49:57 crc kubenswrapper[4786]: I1002 06:49:57.497522 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 06:49:57 crc kubenswrapper[4786]: I1002 06:49:57.497554 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq"
Oct 02 06:49:57 crc kubenswrapper[4786]: I1002 06:49:57.497916 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 06:49:57 crc kubenswrapper[4786]: I1002 06:49:57.497960 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e" gracePeriod=600
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.063637 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e" exitCode=0
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.063720 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e"}
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.063921 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"2c64abc9152933569ee60a2038ba082fca146fe30b68dc264add9f4e59c75ef2"}
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.308565 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c9mns"]
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.308767 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c9mns" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerName="registry-server" containerID="cri-o://904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec" gracePeriod=30
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.315248 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fkn49"]
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.315449 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fkn49" podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerName="registry-server" containerID="cri-o://4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9" gracePeriod=30
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.323975 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swzbl"]
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.324152 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" podUID="6f80061e-e327-432d-a5dd-e0e671298e44" containerName="marketplace-operator" containerID="cri-o://ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79" gracePeriod=30
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.330443 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbkbh"]
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.330596 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tbkbh" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerName="registry-server" containerID="cri-o://63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478" gracePeriod=30
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.334068 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tq2t2"]
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.334568 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.338728 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fskng"]
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.338909 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fskng" podUID="68893526-5b68-42b3-8711-a11fed5996e7" containerName="registry-server" containerID="cri-o://ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369" gracePeriod=30
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.348901 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tq2t2"]
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.523180 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53de3bf0-46ae-4969-a69e-2ad45e207407-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.523556 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rshfm\" (UniqueName: \"kubernetes.io/projected/53de3bf0-46ae-4969-a69e-2ad45e207407-kube-api-access-rshfm\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.523594 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53de3bf0-46ae-4969-a69e-2ad45e207407-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.624940 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53de3bf0-46ae-4969-a69e-2ad45e207407-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.625070 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rshfm\" (UniqueName: \"kubernetes.io/projected/53de3bf0-46ae-4969-a69e-2ad45e207407-kube-api-access-rshfm\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.625096 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53de3bf0-46ae-4969-a69e-2ad45e207407-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.626648 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53de3bf0-46ae-4969-a69e-2ad45e207407-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.634395 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53de3bf0-46ae-4969-a69e-2ad45e207407-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.642068 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rshfm\" (UniqueName: \"kubernetes.io/projected/53de3bf0-46ae-4969-a69e-2ad45e207407-kube-api-access-rshfm\") pod \"marketplace-operator-79b997595-tq2t2\" (UID: \"53de3bf0-46ae-4969-a69e-2ad45e207407\") " pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.688749 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.691671 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c9mns"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.699243 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fkn49"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.700785 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fskng"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729294 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-catalog-content\") pod \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729329 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-catalog-content\") pod \"68893526-5b68-42b3-8711-a11fed5996e7\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729355 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-utilities\") pod \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729379 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbsv6\" (UniqueName: \"kubernetes.io/projected/68893526-5b68-42b3-8711-a11fed5996e7-kube-api-access-kbsv6\") pod \"68893526-5b68-42b3-8711-a11fed5996e7\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729404 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4bm5\" (UniqueName: \"kubernetes.io/projected/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-kube-api-access-t4bm5\") pod \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\" (UID: \"0929cab2-9dd2-42c2-a91f-1b98bf72ace3\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729422 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-utilities\") pod \"074fa372-0ddd-47ea-a3ad-9203d7574875\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729444 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncpmx\" (UniqueName: \"kubernetes.io/projected/074fa372-0ddd-47ea-a3ad-9203d7574875-kube-api-access-ncpmx\") pod \"074fa372-0ddd-47ea-a3ad-9203d7574875\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729470 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-catalog-content\") pod \"074fa372-0ddd-47ea-a3ad-9203d7574875\" (UID: \"074fa372-0ddd-47ea-a3ad-9203d7574875\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.729488 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-utilities\") pod \"68893526-5b68-42b3-8711-a11fed5996e7\" (UID: \"68893526-5b68-42b3-8711-a11fed5996e7\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.731546 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-utilities" (OuterVolumeSpecName: "utilities") pod "0929cab2-9dd2-42c2-a91f-1b98bf72ace3" (UID: "0929cab2-9dd2-42c2-a91f-1b98bf72ace3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.731779 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-utilities" (OuterVolumeSpecName: "utilities") pod "68893526-5b68-42b3-8711-a11fed5996e7" (UID: "68893526-5b68-42b3-8711-a11fed5996e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.732312 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-utilities" (OuterVolumeSpecName: "utilities") pod "074fa372-0ddd-47ea-a3ad-9203d7574875" (UID: "074fa372-0ddd-47ea-a3ad-9203d7574875"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.733509 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68893526-5b68-42b3-8711-a11fed5996e7-kube-api-access-kbsv6" (OuterVolumeSpecName: "kube-api-access-kbsv6") pod "68893526-5b68-42b3-8711-a11fed5996e7" (UID: "68893526-5b68-42b3-8711-a11fed5996e7"). InnerVolumeSpecName "kube-api-access-kbsv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.733972 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074fa372-0ddd-47ea-a3ad-9203d7574875-kube-api-access-ncpmx" (OuterVolumeSpecName: "kube-api-access-ncpmx") pod "074fa372-0ddd-47ea-a3ad-9203d7574875" (UID: "074fa372-0ddd-47ea-a3ad-9203d7574875"). InnerVolumeSpecName "kube-api-access-ncpmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.738122 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-kube-api-access-t4bm5" (OuterVolumeSpecName: "kube-api-access-t4bm5") pod "0929cab2-9dd2-42c2-a91f-1b98bf72ace3" (UID: "0929cab2-9dd2-42c2-a91f-1b98bf72ace3"). InnerVolumeSpecName "kube-api-access-t4bm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.740725 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.761845 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbkbh"
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.781065 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "074fa372-0ddd-47ea-a3ad-9203d7574875" (UID: "074fa372-0ddd-47ea-a3ad-9203d7574875"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.789016 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0929cab2-9dd2-42c2-a91f-1b98bf72ace3" (UID: "0929cab2-9dd2-42c2-a91f-1b98bf72ace3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.816142 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68893526-5b68-42b3-8711-a11fed5996e7" (UID: "68893526-5b68-42b3-8711-a11fed5996e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830292 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-catalog-content\") pod \"dfdbf275-781b-4a7d-b943-b592d682d11a\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830340 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-operator-metrics\") pod \"6f80061e-e327-432d-a5dd-e0e671298e44\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830401 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-trusted-ca\") pod \"6f80061e-e327-432d-a5dd-e0e671298e44\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830426 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rn72\" (UniqueName: \"kubernetes.io/projected/dfdbf275-781b-4a7d-b943-b592d682d11a-kube-api-access-4rn72\") pod \"dfdbf275-781b-4a7d-b943-b592d682d11a\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830448 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-utilities\") pod \"dfdbf275-781b-4a7d-b943-b592d682d11a\" (UID: \"dfdbf275-781b-4a7d-b943-b592d682d11a\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830463 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp9jl\" (UniqueName: \"kubernetes.io/projected/6f80061e-e327-432d-a5dd-e0e671298e44-kube-api-access-hp9jl\") pod \"6f80061e-e327-432d-a5dd-e0e671298e44\" (UID: \"6f80061e-e327-432d-a5dd-e0e671298e44\") "
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830610 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830629 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830638 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbsv6\" (UniqueName: \"kubernetes.io/projected/68893526-5b68-42b3-8711-a11fed5996e7-kube-api-access-kbsv6\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830649 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4bm5\" (UniqueName: \"kubernetes.io/projected/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-kube-api-access-t4bm5\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830656 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830665 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncpmx\" (UniqueName: \"kubernetes.io/projected/074fa372-0ddd-47ea-a3ad-9203d7574875-kube-api-access-ncpmx\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830673 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074fa372-0ddd-47ea-a3ad-9203d7574875-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830681 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68893526-5b68-42b3-8711-a11fed5996e7-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.830702 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0929cab2-9dd2-42c2-a91f-1b98bf72ace3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.831655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-utilities" (OuterVolumeSpecName: "utilities") pod "dfdbf275-781b-4a7d-b943-b592d682d11a" (UID: "dfdbf275-781b-4a7d-b943-b592d682d11a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.831881 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6f80061e-e327-432d-a5dd-e0e671298e44" (UID: "6f80061e-e327-432d-a5dd-e0e671298e44"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.833516 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfdbf275-781b-4a7d-b943-b592d682d11a-kube-api-access-4rn72" (OuterVolumeSpecName: "kube-api-access-4rn72") pod "dfdbf275-781b-4a7d-b943-b592d682d11a" (UID: "dfdbf275-781b-4a7d-b943-b592d682d11a"). InnerVolumeSpecName "kube-api-access-4rn72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.833588 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6f80061e-e327-432d-a5dd-e0e671298e44" (UID: "6f80061e-e327-432d-a5dd-e0e671298e44"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.834221 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f80061e-e327-432d-a5dd-e0e671298e44-kube-api-access-hp9jl" (OuterVolumeSpecName: "kube-api-access-hp9jl") pod "6f80061e-e327-432d-a5dd-e0e671298e44" (UID: "6f80061e-e327-432d-a5dd-e0e671298e44"). InnerVolumeSpecName "kube-api-access-hp9jl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.840538 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dfdbf275-781b-4a7d-b943-b592d682d11a" (UID: "dfdbf275-781b-4a7d-b943-b592d682d11a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.931596 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.931630 4786 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f80061e-e327-432d-a5dd-e0e671298e44-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.931640 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rn72\" (UniqueName: \"kubernetes.io/projected/dfdbf275-781b-4a7d-b943-b592d682d11a-kube-api-access-4rn72\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.931650 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.931659 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp9jl\" (UniqueName: \"kubernetes.io/projected/6f80061e-e327-432d-a5dd-e0e671298e44-kube-api-access-hp9jl\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:58 crc kubenswrapper[4786]: I1002 06:49:58.931680 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfdbf275-781b-4a7d-b943-b592d682d11a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.062720 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tq2t2"]
Oct 02 06:49:59 crc kubenswrapper[4786]: W1002 06:49:59.065602 4786 manager.go:1169]
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53de3bf0_46ae_4969_a69e_2ad45e207407.slice/crio-eccaa23974c3be094e8c278712c5a5db07532bc7d07de37d09071c9a416890c1 WatchSource:0}: Error finding container eccaa23974c3be094e8c278712c5a5db07532bc7d07de37d09071c9a416890c1: Status 404 returned error can't find the container with id eccaa23974c3be094e8c278712c5a5db07532bc7d07de37d09071c9a416890c1 Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.073543 4786 generic.go:334] "Generic (PLEG): container finished" podID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerID="63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478" exitCode=0 Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.073682 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbkbh" event={"ID":"dfdbf275-781b-4a7d-b943-b592d682d11a","Type":"ContainerDied","Data":"63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.073726 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbkbh" event={"ID":"dfdbf275-781b-4a7d-b943-b592d682d11a","Type":"ContainerDied","Data":"34d54367b6d6bc6fa99262e5b9962a68da4bc1e49c6f6d355ce81c0d9da1577c"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.073743 4786 scope.go:117] "RemoveContainer" containerID="63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.073775 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbkbh" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.076714 4786 generic.go:334] "Generic (PLEG): container finished" podID="6f80061e-e327-432d-a5dd-e0e671298e44" containerID="ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79" exitCode=0 Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.076756 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" event={"ID":"6f80061e-e327-432d-a5dd-e0e671298e44","Type":"ContainerDied","Data":"ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.076772 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" event={"ID":"6f80061e-e327-432d-a5dd-e0e671298e44","Type":"ContainerDied","Data":"eb56807511c4967673c9a7bc882f4c23dcf0fe668de15a2932d902cce559d862"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.076810 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-swzbl" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.079124 4786 generic.go:334] "Generic (PLEG): container finished" podID="68893526-5b68-42b3-8711-a11fed5996e7" containerID="ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369" exitCode=0 Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.079175 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fskng" event={"ID":"68893526-5b68-42b3-8711-a11fed5996e7","Type":"ContainerDied","Data":"ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.079180 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fskng" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.079193 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fskng" event={"ID":"68893526-5b68-42b3-8711-a11fed5996e7","Type":"ContainerDied","Data":"567fd823aea2f0c68eb2febb5f9672715cfa022f3a832531ae6517fbb2921814"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.082358 4786 generic.go:334] "Generic (PLEG): container finished" podID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerID="4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9" exitCode=0 Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.082408 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkn49" event={"ID":"0929cab2-9dd2-42c2-a91f-1b98bf72ace3","Type":"ContainerDied","Data":"4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.082425 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fkn49" event={"ID":"0929cab2-9dd2-42c2-a91f-1b98bf72ace3","Type":"ContainerDied","Data":"5ccaab8d4876694ce729139a7c039be57c324955fe400981f0eca869cc9c9aeb"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.082835 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fkn49" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.085343 4786 generic.go:334] "Generic (PLEG): container finished" podID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerID="904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec" exitCode=0 Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.085378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9mns" event={"ID":"074fa372-0ddd-47ea-a3ad-9203d7574875","Type":"ContainerDied","Data":"904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.085400 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c9mns" event={"ID":"074fa372-0ddd-47ea-a3ad-9203d7574875","Type":"ContainerDied","Data":"3341053ddcef4e9438fda668001e1033bfb123f137a043da2c511ddd2521b76b"} Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.085450 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c9mns" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.097312 4786 scope.go:117] "RemoveContainer" containerID="a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.107142 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbkbh"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.111524 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbkbh"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.116762 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fskng"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.120717 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fskng"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.120955 4786 scope.go:117] "RemoveContainer" containerID="4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.122800 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swzbl"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.124731 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-swzbl"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.132021 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fkn49"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.134550 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fkn49"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.140612 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-c9mns"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.142607 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c9mns"] Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.150310 4786 scope.go:117] "RemoveContainer" containerID="63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.150617 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478\": container with ID starting with 63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478 not found: ID does not exist" containerID="63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.150709 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478"} err="failed to get container status \"63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478\": rpc error: code = NotFound desc = could not find container \"63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478\": container with ID starting with 63cc0f8857c2d29511339f1c7b029a35cf3683f8df0ecab73566b9851bad0478 not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.150797 4786 scope.go:117] "RemoveContainer" containerID="a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.151212 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba\": container with ID starting with 
a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba not found: ID does not exist" containerID="a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.151289 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba"} err="failed to get container status \"a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba\": rpc error: code = NotFound desc = could not find container \"a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba\": container with ID starting with a584ecdd33a86d03371aa96861262470eb05eff630e58cf89e473a36a838b1ba not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.151360 4786 scope.go:117] "RemoveContainer" containerID="4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.153991 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76\": container with ID starting with 4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76 not found: ID does not exist" containerID="4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.154208 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76"} err="failed to get container status \"4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76\": rpc error: code = NotFound desc = could not find container \"4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76\": container with ID starting with 4bdd25ff4b4bcaead72b607c9b4d89a740a5c02e03c43664346f84fba6bebf76 not found: ID does not 
exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.154271 4786 scope.go:117] "RemoveContainer" containerID="ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.164048 4786 scope.go:117] "RemoveContainer" containerID="ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.164522 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79\": container with ID starting with ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79 not found: ID does not exist" containerID="ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.164597 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79"} err="failed to get container status \"ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79\": rpc error: code = NotFound desc = could not find container \"ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79\": container with ID starting with ed6d2ae866b79bb8579b4505203bd0ed396f58fd0bf166136c49de8f8a958d79 not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.164722 4786 scope.go:117] "RemoveContainer" containerID="ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.173657 4786 scope.go:117] "RemoveContainer" containerID="49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.185737 4786 scope.go:117] "RemoveContainer" containerID="3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4" Oct 02 06:49:59 crc 
kubenswrapper[4786]: I1002 06:49:59.195798 4786 scope.go:117] "RemoveContainer" containerID="ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.196033 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369\": container with ID starting with ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369 not found: ID does not exist" containerID="ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.196070 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369"} err="failed to get container status \"ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369\": rpc error: code = NotFound desc = could not find container \"ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369\": container with ID starting with ab380e677f2d8e6d2ab374d2847c782acf84eb5b86f73694dec03c35320ad369 not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.196089 4786 scope.go:117] "RemoveContainer" containerID="49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.196407 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c\": container with ID starting with 49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c not found: ID does not exist" containerID="49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.196430 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c"} err="failed to get container status \"49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c\": rpc error: code = NotFound desc = could not find container \"49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c\": container with ID starting with 49c0a6809cec6adfd4e11db3a27220afcc559fc7196c863c06523e177035d57c not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.196445 4786 scope.go:117] "RemoveContainer" containerID="3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.196684 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4\": container with ID starting with 3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4 not found: ID does not exist" containerID="3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.196725 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4"} err="failed to get container status \"3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4\": rpc error: code = NotFound desc = could not find container \"3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4\": container with ID starting with 3aa729fa1a4abe971e93a2c5e19a7dba56ac93a833b7c27aaedbfa56749f10a4 not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.196737 4786 scope.go:117] "RemoveContainer" containerID="4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.205147 4786 scope.go:117] "RemoveContainer" 
containerID="679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.214982 4786 scope.go:117] "RemoveContainer" containerID="29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.223879 4786 scope.go:117] "RemoveContainer" containerID="4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.224251 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9\": container with ID starting with 4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9 not found: ID does not exist" containerID="4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.224276 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9"} err="failed to get container status \"4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9\": rpc error: code = NotFound desc = could not find container \"4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9\": container with ID starting with 4b5b8b957b81f0ec9bae35c2e93f4d0bb5533c11a74e67fc492da68fb8436ba9 not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.224290 4786 scope.go:117] "RemoveContainer" containerID="679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.224634 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5\": container with ID starting with 
679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5 not found: ID does not exist" containerID="679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.224660 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5"} err="failed to get container status \"679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5\": rpc error: code = NotFound desc = could not find container \"679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5\": container with ID starting with 679ab8e580db5ed21477bd5af6bc58a29e2fb57925ecd32495eb4c861b712cc5 not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.224678 4786 scope.go:117] "RemoveContainer" containerID="29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.224915 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764\": container with ID starting with 29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764 not found: ID does not exist" containerID="29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.224935 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764"} err="failed to get container status \"29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764\": rpc error: code = NotFound desc = could not find container \"29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764\": container with ID starting with 29860608d47ea5ce8fdf9b7cc84403d70aad47fc77747d605afb2d3601198764 not found: ID does not 
exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.224947 4786 scope.go:117] "RemoveContainer" containerID="904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.235085 4786 scope.go:117] "RemoveContainer" containerID="06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.243824 4786 scope.go:117] "RemoveContainer" containerID="a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.252704 4786 scope.go:117] "RemoveContainer" containerID="904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.253056 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec\": container with ID starting with 904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec not found: ID does not exist" containerID="904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.253103 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec"} err="failed to get container status \"904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec\": rpc error: code = NotFound desc = could not find container \"904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec\": container with ID starting with 904400a2901e0309e54f5aad36d44946706b89e0fc39c2118affb96fd53457ec not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.253126 4786 scope.go:117] "RemoveContainer" containerID="06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319" Oct 02 06:49:59 crc 
kubenswrapper[4786]: E1002 06:49:59.253356 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319\": container with ID starting with 06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319 not found: ID does not exist" containerID="06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.253374 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319"} err="failed to get container status \"06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319\": rpc error: code = NotFound desc = could not find container \"06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319\": container with ID starting with 06d3706c193a9e355a46128d6629311219a9aa8ed843de7c88b4d08c6cec3319 not found: ID does not exist" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.253386 4786 scope.go:117] "RemoveContainer" containerID="a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61" Oct 02 06:49:59 crc kubenswrapper[4786]: E1002 06:49:59.253630 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61\": container with ID starting with a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61 not found: ID does not exist" containerID="a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61" Oct 02 06:49:59 crc kubenswrapper[4786]: I1002 06:49:59.253656 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61"} err="failed to get container status 
\"a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61\": rpc error: code = NotFound desc = could not find container \"a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61\": container with ID starting with a2395d8a4ff10b78509a7581e7d73b16485874768df4fd3a2e86fd9a7622cb61 not found: ID does not exist" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.093480 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2" event={"ID":"53de3bf0-46ae-4969-a69e-2ad45e207407","Type":"ContainerStarted","Data":"1d7cde1800ef247e54d5a3b4ce52025e2968c9c8cef62c572a89595024905710"} Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.093702 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.093715 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2" event={"ID":"53de3bf0-46ae-4969-a69e-2ad45e207407","Type":"ContainerStarted","Data":"eccaa23974c3be094e8c278712c5a5db07532bc7d07de37d09071c9a416890c1"} Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.096743 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.105237 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tq2t2" podStartSLOduration=2.105223885 podStartE2EDuration="2.105223885s" podCreationTimestamp="2025-10-02 06:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:50:00.104922315 +0000 UTC m=+210.226105456" watchObservedRunningTime="2025-10-02 06:50:00.105223885 +0000 UTC m=+210.226407016" Oct 02 06:50:00 
crc kubenswrapper[4786]: I1002 06:50:00.183467 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" path="/var/lib/kubelet/pods/074fa372-0ddd-47ea-a3ad-9203d7574875/volumes" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.184042 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" path="/var/lib/kubelet/pods/0929cab2-9dd2-42c2-a91f-1b98bf72ace3/volumes" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.184559 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68893526-5b68-42b3-8711-a11fed5996e7" path="/var/lib/kubelet/pods/68893526-5b68-42b3-8711-a11fed5996e7/volumes" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.185157 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f80061e-e327-432d-a5dd-e0e671298e44" path="/var/lib/kubelet/pods/6f80061e-e327-432d-a5dd-e0e671298e44/volumes" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.185547 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" path="/var/lib/kubelet/pods/dfdbf275-781b-4a7d-b943-b592d682d11a/volumes" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.519721 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rtbzr"] Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.520173 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68893526-5b68-42b3-8711-a11fed5996e7" containerName="extract-content" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.520266 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="68893526-5b68-42b3-8711-a11fed5996e7" containerName="extract-content" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.520335 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" 
containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.520385 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.520432 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f80061e-e327-432d-a5dd-e0e671298e44" containerName="marketplace-operator" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.520500 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f80061e-e327-432d-a5dd-e0e671298e44" containerName="marketplace-operator" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.520585 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerName="extract-content" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.520649 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerName="extract-content" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.520726 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerName="extract-content" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.520775 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerName="extract-content" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.520820 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68893526-5b68-42b3-8711-a11fed5996e7" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.520863 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="68893526-5b68-42b3-8711-a11fed5996e7" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.520906 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68893526-5b68-42b3-8711-a11fed5996e7" 
containerName="extract-utilities" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.520965 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="68893526-5b68-42b3-8711-a11fed5996e7" containerName="extract-utilities" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.521021 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerName="extract-utilities" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.521066 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerName="extract-utilities" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.522035 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerName="extract-utilities" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522082 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerName="extract-utilities" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.522157 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522205 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.522254 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerName="extract-utilities" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522297 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerName="extract-utilities" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.522347 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522391 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: E1002 06:50:00.522436 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerName="extract-content" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522478 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerName="extract-content" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522628 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="68893526-5b68-42b3-8711-a11fed5996e7" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522711 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="074fa372-0ddd-47ea-a3ad-9203d7574875" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522768 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdbf275-781b-4a7d-b943-b592d682d11a" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522813 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f80061e-e327-432d-a5dd-e0e671298e44" containerName="marketplace-operator" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.522861 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0929cab2-9dd2-42c2-a91f-1b98bf72ace3" containerName="registry-server" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.524630 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.526158 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.526615 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rtbzr"] Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.644641 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk5qr\" (UniqueName: \"kubernetes.io/projected/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-kube-api-access-dk5qr\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.644685 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-utilities\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.644744 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-catalog-content\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.720060 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qmvfz"] Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.720994 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.722636 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.725874 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmvfz"] Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.746209 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-catalog-content\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.746279 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d3da98-14ac-4121-af09-590caee1d21e-catalog-content\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.746302 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk5qr\" (UniqueName: \"kubernetes.io/projected/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-kube-api-access-dk5qr\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.746328 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88d3da98-14ac-4121-af09-590caee1d21e-utilities\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " 
pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.746345 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-utilities\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.746421 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6h7f\" (UniqueName: \"kubernetes.io/projected/88d3da98-14ac-4121-af09-590caee1d21e-kube-api-access-c6h7f\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.746576 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-catalog-content\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.746626 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-utilities\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.760484 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk5qr\" (UniqueName: \"kubernetes.io/projected/cdc7bd8b-8dab-48d7-a338-a6eb79d14c13-kube-api-access-dk5qr\") pod \"redhat-marketplace-rtbzr\" (UID: \"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13\") " 
pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.836448 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rtbzr" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.847501 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6h7f\" (UniqueName: \"kubernetes.io/projected/88d3da98-14ac-4121-af09-590caee1d21e-kube-api-access-c6h7f\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.847675 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d3da98-14ac-4121-af09-590caee1d21e-catalog-content\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.847729 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88d3da98-14ac-4121-af09-590caee1d21e-utilities\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.848020 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88d3da98-14ac-4121-af09-590caee1d21e-catalog-content\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.848037 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/88d3da98-14ac-4121-af09-590caee1d21e-utilities\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:00 crc kubenswrapper[4786]: I1002 06:50:00.861214 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6h7f\" (UniqueName: \"kubernetes.io/projected/88d3da98-14ac-4121-af09-590caee1d21e-kube-api-access-c6h7f\") pod \"redhat-operators-qmvfz\" (UID: \"88d3da98-14ac-4121-af09-590caee1d21e\") " pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:01 crc kubenswrapper[4786]: I1002 06:50:01.036185 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qmvfz" Oct 02 06:50:01 crc kubenswrapper[4786]: I1002 06:50:01.164668 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rtbzr"] Oct 02 06:50:01 crc kubenswrapper[4786]: W1002 06:50:01.173448 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdc7bd8b_8dab_48d7_a338_a6eb79d14c13.slice/crio-14e4e67e489e15f9cc9c0293816907d58d2ed5ab5347a640e54cbe8719051e00 WatchSource:0}: Error finding container 14e4e67e489e15f9cc9c0293816907d58d2ed5ab5347a640e54cbe8719051e00: Status 404 returned error can't find the container with id 14e4e67e489e15f9cc9c0293816907d58d2ed5ab5347a640e54cbe8719051e00 Oct 02 06:50:01 crc kubenswrapper[4786]: I1002 06:50:01.369967 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qmvfz"] Oct 02 06:50:01 crc kubenswrapper[4786]: W1002 06:50:01.375763 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88d3da98_14ac_4121_af09_590caee1d21e.slice/crio-cceb060b542434143c4eacc8686e23d6d7ea88e9bf7a52b1c69cbeeae6772c28 WatchSource:0}: 
Error finding container cceb060b542434143c4eacc8686e23d6d7ea88e9bf7a52b1c69cbeeae6772c28: Status 404 returned error can't find the container with id cceb060b542434143c4eacc8686e23d6d7ea88e9bf7a52b1c69cbeeae6772c28 Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.105431 4786 generic.go:334] "Generic (PLEG): container finished" podID="cdc7bd8b-8dab-48d7-a338-a6eb79d14c13" containerID="31ea8e5d8ab03f648752b707fc98c728c366a165df8384a359b08235cbc5ca23" exitCode=0 Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.105525 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtbzr" event={"ID":"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13","Type":"ContainerDied","Data":"31ea8e5d8ab03f648752b707fc98c728c366a165df8384a359b08235cbc5ca23"} Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.105959 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtbzr" event={"ID":"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13","Type":"ContainerStarted","Data":"14e4e67e489e15f9cc9c0293816907d58d2ed5ab5347a640e54cbe8719051e00"} Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.108991 4786 generic.go:334] "Generic (PLEG): container finished" podID="88d3da98-14ac-4121-af09-590caee1d21e" containerID="eb0810b726fc951b39d46d5260ea697ac98a621ef4bf03199399ff9ec1b36c34" exitCode=0 Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.109779 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmvfz" event={"ID":"88d3da98-14ac-4121-af09-590caee1d21e","Type":"ContainerDied","Data":"eb0810b726fc951b39d46d5260ea697ac98a621ef4bf03199399ff9ec1b36c34"} Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.109818 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmvfz" 
event={"ID":"88d3da98-14ac-4121-af09-590caee1d21e","Type":"ContainerStarted","Data":"cceb060b542434143c4eacc8686e23d6d7ea88e9bf7a52b1c69cbeeae6772c28"} Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.923711 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kk5wh"] Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.926411 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.927293 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk5wh"] Oct 02 06:50:02 crc kubenswrapper[4786]: I1002 06:50:02.927630 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.070996 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5bg\" (UniqueName: \"kubernetes.io/projected/a3a03859-c965-4381-ae79-03adc9b0e700-kube-api-access-pw5bg\") pod \"community-operators-kk5wh\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.071030 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-catalog-content\") pod \"community-operators-kk5wh\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.071069 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-utilities\") pod 
\"community-operators-kk5wh\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.119523 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bg46m"] Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.120341 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.121554 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.126822 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bg46m"] Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.172790 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-utilities\") pod \"community-operators-kk5wh\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.172830 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2j6r\" (UniqueName: \"kubernetes.io/projected/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-kube-api-access-x2j6r\") pod \"certified-operators-bg46m\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.172882 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-catalog-content\") pod \"certified-operators-bg46m\" (UID: 
\"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.172912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-catalog-content\") pod \"community-operators-kk5wh\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.172957 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5bg\" (UniqueName: \"kubernetes.io/projected/a3a03859-c965-4381-ae79-03adc9b0e700-kube-api-access-pw5bg\") pod \"community-operators-kk5wh\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.173055 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-utilities\") pod \"certified-operators-bg46m\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.173174 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-utilities\") pod \"community-operators-kk5wh\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.173267 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-catalog-content\") pod \"community-operators-kk5wh\" (UID: 
\"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.188018 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5bg\" (UniqueName: \"kubernetes.io/projected/a3a03859-c965-4381-ae79-03adc9b0e700-kube-api-access-pw5bg\") pod \"community-operators-kk5wh\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.243011 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk5wh" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.274953 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-catalog-content\") pod \"certified-operators-bg46m\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.275138 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-utilities\") pod \"certified-operators-bg46m\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.275200 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2j6r\" (UniqueName: \"kubernetes.io/projected/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-kube-api-access-x2j6r\") pod \"certified-operators-bg46m\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.275377 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-catalog-content\") pod \"certified-operators-bg46m\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.276886 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-utilities\") pod \"certified-operators-bg46m\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.290463 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2j6r\" (UniqueName: \"kubernetes.io/projected/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-kube-api-access-x2j6r\") pod \"certified-operators-bg46m\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.430919 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg46m" Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.580831 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kk5wh"] Oct 02 06:50:03 crc kubenswrapper[4786]: W1002 06:50:03.586311 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3a03859_c965_4381_ae79_03adc9b0e700.slice/crio-7e3aa8e97e20caaf9e97a5950d32f2307f990418df6b3b64b9401c378fb2807c WatchSource:0}: Error finding container 7e3aa8e97e20caaf9e97a5950d32f2307f990418df6b3b64b9401c378fb2807c: Status 404 returned error can't find the container with id 7e3aa8e97e20caaf9e97a5950d32f2307f990418df6b3b64b9401c378fb2807c Oct 02 06:50:03 crc kubenswrapper[4786]: I1002 06:50:03.762287 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bg46m"] Oct 02 06:50:03 crc kubenswrapper[4786]: W1002 06:50:03.767387 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e115b01_78ba_4811_a0e6_24cca7bbb0f7.slice/crio-9a0534a9504e1d00ba4623c58bd5241b8005705771555410eff011db3e15a5b8 WatchSource:0}: Error finding container 9a0534a9504e1d00ba4623c58bd5241b8005705771555410eff011db3e15a5b8: Status 404 returned error can't find the container with id 9a0534a9504e1d00ba4623c58bd5241b8005705771555410eff011db3e15a5b8 Oct 02 06:50:04 crc kubenswrapper[4786]: I1002 06:50:04.117569 4786 generic.go:334] "Generic (PLEG): container finished" podID="88d3da98-14ac-4121-af09-590caee1d21e" containerID="72cf8c43432a31999f4779e575dd3d07d281431d3ce20da67cc752b8576014ca" exitCode=0 Oct 02 06:50:04 crc kubenswrapper[4786]: I1002 06:50:04.117624 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmvfz" 
event={"ID":"88d3da98-14ac-4121-af09-590caee1d21e","Type":"ContainerDied","Data":"72cf8c43432a31999f4779e575dd3d07d281431d3ce20da67cc752b8576014ca"}
Oct 02 06:50:04 crc kubenswrapper[4786]: I1002 06:50:04.118666 4786 generic.go:334] "Generic (PLEG): container finished" podID="a3a03859-c965-4381-ae79-03adc9b0e700" containerID="a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372" exitCode=0
Oct 02 06:50:04 crc kubenswrapper[4786]: I1002 06:50:04.118721 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5wh" event={"ID":"a3a03859-c965-4381-ae79-03adc9b0e700","Type":"ContainerDied","Data":"a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372"}
Oct 02 06:50:04 crc kubenswrapper[4786]: I1002 06:50:04.118736 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5wh" event={"ID":"a3a03859-c965-4381-ae79-03adc9b0e700","Type":"ContainerStarted","Data":"7e3aa8e97e20caaf9e97a5950d32f2307f990418df6b3b64b9401c378fb2807c"}
Oct 02 06:50:04 crc kubenswrapper[4786]: I1002 06:50:04.121730 4786 generic.go:334] "Generic (PLEG): container finished" podID="cdc7bd8b-8dab-48d7-a338-a6eb79d14c13" containerID="14995827daa0dd1e62c2452f1cf0eddb657d7ab37e264a8d25fa458fee9f529b" exitCode=0
Oct 02 06:50:04 crc kubenswrapper[4786]: I1002 06:50:04.121759 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtbzr" event={"ID":"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13","Type":"ContainerDied","Data":"14995827daa0dd1e62c2452f1cf0eddb657d7ab37e264a8d25fa458fee9f529b"}
Oct 02 06:50:04 crc kubenswrapper[4786]: I1002 06:50:04.123404 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg46m" event={"ID":"1e115b01-78ba-4811-a0e6-24cca7bbb0f7","Type":"ContainerStarted","Data":"9a0534a9504e1d00ba4623c58bd5241b8005705771555410eff011db3e15a5b8"}
Oct 02 06:50:05 crc kubenswrapper[4786]: I1002 06:50:05.128649 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rtbzr" event={"ID":"cdc7bd8b-8dab-48d7-a338-a6eb79d14c13","Type":"ContainerStarted","Data":"3ed525780da954d406e9e069cc1d100e33374000b780c0e7fa25110dd515428f"}
Oct 02 06:50:05 crc kubenswrapper[4786]: I1002 06:50:05.130541 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qmvfz" event={"ID":"88d3da98-14ac-4121-af09-590caee1d21e","Type":"ContainerStarted","Data":"8f318fb29ee7c5b23f937d27f9caca4c5501d80f003a173ce48262d25465dc06"}
Oct 02 06:50:05 crc kubenswrapper[4786]: I1002 06:50:05.132369 4786 generic.go:334] "Generic (PLEG): container finished" podID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerID="e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6" exitCode=0
Oct 02 06:50:05 crc kubenswrapper[4786]: I1002 06:50:05.132409 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg46m" event={"ID":"1e115b01-78ba-4811-a0e6-24cca7bbb0f7","Type":"ContainerDied","Data":"e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6"}
Oct 02 06:50:05 crc kubenswrapper[4786]: I1002 06:50:05.134590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5wh" event={"ID":"a3a03859-c965-4381-ae79-03adc9b0e700","Type":"ContainerStarted","Data":"6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac"}
Oct 02 06:50:05 crc kubenswrapper[4786]: I1002 06:50:05.142659 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rtbzr" podStartSLOduration=2.560326821 podStartE2EDuration="5.14265009s" podCreationTimestamp="2025-10-02 06:50:00 +0000 UTC" firstStartedPulling="2025-10-02 06:50:02.107015803 +0000 UTC m=+212.228198935" lastFinishedPulling="2025-10-02 06:50:04.689339073 +0000 UTC m=+214.810522204" observedRunningTime="2025-10-02 06:50:05.139589284 +0000 UTC m=+215.260772425" watchObservedRunningTime="2025-10-02 06:50:05.14265009 +0000 UTC m=+215.263833221"
Oct 02 06:50:05 crc kubenswrapper[4786]: I1002 06:50:05.179516 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qmvfz" podStartSLOduration=2.551519184 podStartE2EDuration="5.179501839s" podCreationTimestamp="2025-10-02 06:50:00 +0000 UTC" firstStartedPulling="2025-10-02 06:50:02.110284911 +0000 UTC m=+212.231468042" lastFinishedPulling="2025-10-02 06:50:04.738267566 +0000 UTC m=+214.859450697" observedRunningTime="2025-10-02 06:50:05.167335822 +0000 UTC m=+215.288518963" watchObservedRunningTime="2025-10-02 06:50:05.179501839 +0000 UTC m=+215.300684970"
Oct 02 06:50:06 crc kubenswrapper[4786]: I1002 06:50:06.148831 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg46m" event={"ID":"1e115b01-78ba-4811-a0e6-24cca7bbb0f7","Type":"ContainerStarted","Data":"ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a"}
Oct 02 06:50:06 crc kubenswrapper[4786]: I1002 06:50:06.150258 4786 generic.go:334] "Generic (PLEG): container finished" podID="a3a03859-c965-4381-ae79-03adc9b0e700" containerID="6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac" exitCode=0
Oct 02 06:50:06 crc kubenswrapper[4786]: I1002 06:50:06.150546 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5wh" event={"ID":"a3a03859-c965-4381-ae79-03adc9b0e700","Type":"ContainerDied","Data":"6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac"}
Oct 02 06:50:07 crc kubenswrapper[4786]: I1002 06:50:07.156464 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5wh" event={"ID":"a3a03859-c965-4381-ae79-03adc9b0e700","Type":"ContainerStarted","Data":"88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103"}
Oct 02 06:50:07 crc kubenswrapper[4786]: I1002 06:50:07.158013 4786 generic.go:334] "Generic (PLEG): container finished" podID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerID="ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a" exitCode=0
Oct 02 06:50:07 crc kubenswrapper[4786]: I1002 06:50:07.158047 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg46m" event={"ID":"1e115b01-78ba-4811-a0e6-24cca7bbb0f7","Type":"ContainerDied","Data":"ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a"}
Oct 02 06:50:07 crc kubenswrapper[4786]: I1002 06:50:07.170344 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kk5wh" podStartSLOduration=2.690526077 podStartE2EDuration="5.1703334s" podCreationTimestamp="2025-10-02 06:50:02 +0000 UTC" firstStartedPulling="2025-10-02 06:50:04.119558628 +0000 UTC m=+214.240741759" lastFinishedPulling="2025-10-02 06:50:06.599365951 +0000 UTC m=+216.720549082" observedRunningTime="2025-10-02 06:50:07.169087394 +0000 UTC m=+217.290270525" watchObservedRunningTime="2025-10-02 06:50:07.1703334 +0000 UTC m=+217.291516531"
Oct 02 06:50:08 crc kubenswrapper[4786]: I1002 06:50:08.163753 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg46m" event={"ID":"1e115b01-78ba-4811-a0e6-24cca7bbb0f7","Type":"ContainerStarted","Data":"0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c"}
Oct 02 06:50:08 crc kubenswrapper[4786]: I1002 06:50:08.176498 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bg46m" podStartSLOduration=2.613033739 podStartE2EDuration="5.176482301s" podCreationTimestamp="2025-10-02 06:50:03 +0000 UTC" firstStartedPulling="2025-10-02 06:50:05.133217571 +0000 UTC m=+215.254400702" lastFinishedPulling="2025-10-02 06:50:07.696666133 +0000 UTC m=+217.817849264" observedRunningTime="2025-10-02 06:50:08.17466162 +0000 UTC m=+218.295844761" watchObservedRunningTime="2025-10-02 06:50:08.176482301 +0000 UTC m=+218.297665432"
Oct 02 06:50:10 crc kubenswrapper[4786]: I1002 06:50:10.836941 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rtbzr"
Oct 02 06:50:10 crc kubenswrapper[4786]: I1002 06:50:10.837188 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rtbzr"
Oct 02 06:50:10 crc kubenswrapper[4786]: I1002 06:50:10.862267 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rtbzr"
Oct 02 06:50:11 crc kubenswrapper[4786]: I1002 06:50:11.036597 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qmvfz"
Oct 02 06:50:11 crc kubenswrapper[4786]: I1002 06:50:11.036730 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qmvfz"
Oct 02 06:50:11 crc kubenswrapper[4786]: I1002 06:50:11.059792 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qmvfz"
Oct 02 06:50:11 crc kubenswrapper[4786]: I1002 06:50:11.205023 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rtbzr"
Oct 02 06:50:11 crc kubenswrapper[4786]: I1002 06:50:11.205138 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qmvfz"
Oct 02 06:50:13 crc kubenswrapper[4786]: I1002 06:50:13.243728 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kk5wh"
Oct 02 06:50:13 crc kubenswrapper[4786]: I1002 06:50:13.243768 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kk5wh"
Oct 02 06:50:13 crc kubenswrapper[4786]: I1002 06:50:13.269026 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kk5wh"
Oct 02 06:50:13 crc kubenswrapper[4786]: I1002 06:50:13.431524 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bg46m"
Oct 02 06:50:13 crc kubenswrapper[4786]: I1002 06:50:13.431585 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bg46m"
Oct 02 06:50:13 crc kubenswrapper[4786]: I1002 06:50:13.454976 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bg46m"
Oct 02 06:50:14 crc kubenswrapper[4786]: I1002 06:50:14.215970 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kk5wh"
Oct 02 06:50:14 crc kubenswrapper[4786]: I1002 06:50:14.216489 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bg46m"
Oct 02 06:51:57 crc kubenswrapper[4786]: I1002 06:51:57.497067 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 06:51:57 crc kubenswrapper[4786]: I1002 06:51:57.497415 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 06:52:27 crc kubenswrapper[4786]: I1002 06:52:27.498344 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 06:52:27 crc kubenswrapper[4786]: I1002 06:52:27.501979 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.497786 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.498169 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.498205 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq"
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.498544 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c64abc9152933569ee60a2038ba082fca146fe30b68dc264add9f4e59c75ef2"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.498597 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://2c64abc9152933569ee60a2038ba082fca146fe30b68dc264add9f4e59c75ef2" gracePeriod=600
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.767195 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="2c64abc9152933569ee60a2038ba082fca146fe30b68dc264add9f4e59c75ef2" exitCode=0
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.767266 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"2c64abc9152933569ee60a2038ba082fca146fe30b68dc264add9f4e59c75ef2"}
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.767377 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"6b5bd4f7ef853564be38c5c22d2f88f290b16d0915727305dc89bbed1ec9a81c"}
Oct 02 06:52:57 crc kubenswrapper[4786]: I1002 06:52:57.767398 4786 scope.go:117] "RemoveContainer" containerID="41a7460f850ad3b625f5bb44eb2d6bd1c317561582892740c7362cb9ad9ed49e"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.676701 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tfj6r"]
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.677649 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.687092 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tfj6r"]
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.837769 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a0d5284-a354-47e2-9b98-4744688510f8-trusted-ca\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.837806 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a0d5284-a354-47e2-9b98-4744688510f8-registry-certificates\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.837858 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a0d5284-a354-47e2-9b98-4744688510f8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.837940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79qs\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-kube-api-access-j79qs\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.837979 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-bound-sa-token\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.838001 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-registry-tls\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.838079 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a0d5284-a354-47e2-9b98-4744688510f8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.838124 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.852903 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.939807 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a0d5284-a354-47e2-9b98-4744688510f8-trusted-ca\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.939843 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a0d5284-a354-47e2-9b98-4744688510f8-registry-certificates\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.939911 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a0d5284-a354-47e2-9b98-4744688510f8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.939950 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j79qs\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-kube-api-access-j79qs\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.939968 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-bound-sa-token\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.939984 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-registry-tls\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.940029 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a0d5284-a354-47e2-9b98-4744688510f8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.940828 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a0d5284-a354-47e2-9b98-4744688510f8-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.941378 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a0d5284-a354-47e2-9b98-4744688510f8-trusted-ca\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.941919 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a0d5284-a354-47e2-9b98-4744688510f8-registry-certificates\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.944514 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a0d5284-a354-47e2-9b98-4744688510f8-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.944542 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-registry-tls\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.952838 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-bound-sa-token\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.953476 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79qs\" (UniqueName: \"kubernetes.io/projected/0a0d5284-a354-47e2-9b98-4744688510f8-kube-api-access-j79qs\") pod \"image-registry-66df7c8f76-tfj6r\" (UID: \"0a0d5284-a354-47e2-9b98-4744688510f8\") " pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:53 crc kubenswrapper[4786]: I1002 06:53:53.990992 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:54 crc kubenswrapper[4786]: I1002 06:53:54.319298 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tfj6r"]
Oct 02 06:53:54 crc kubenswrapper[4786]: I1002 06:53:54.982654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r" event={"ID":"0a0d5284-a354-47e2-9b98-4744688510f8","Type":"ContainerStarted","Data":"1f51a8f77016f3aa929aa5bddfb557ce81715277593011103c155c31dd358b61"}
Oct 02 06:53:54 crc kubenswrapper[4786]: I1002 06:53:54.982701 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r" event={"ID":"0a0d5284-a354-47e2-9b98-4744688510f8","Type":"ContainerStarted","Data":"3ebb48caa4376043854e8e30129b9a967e682297dc16d619cb78a06df44b92a5"}
Oct 02 06:53:54 crc kubenswrapper[4786]: I1002 06:53:54.983430 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:53:54 crc kubenswrapper[4786]: I1002 06:53:54.995041 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r" podStartSLOduration=1.995028643 podStartE2EDuration="1.995028643s" podCreationTimestamp="2025-10-02 06:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:53:54.993936422 +0000 UTC m=+445.115119564" watchObservedRunningTime="2025-10-02 06:53:54.995028643 +0000 UTC m=+445.116211775"
Oct 02 06:54:13 crc kubenswrapper[4786]: I1002 06:54:13.995011 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tfj6r"
Oct 02 06:54:14 crc kubenswrapper[4786]: I1002 06:54:14.025876 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66zdg"]
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.689617 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-8tlx5"]
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.691807 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-8tlx5"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.693332 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.693788 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.697041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd5v5\" (UniqueName: \"kubernetes.io/projected/5286d0c8-0ce3-4d66-882f-9ffea6c90fa4-kube-api-access-rd5v5\") pod \"cert-manager-cainjector-7f985d654d-8tlx5\" (UID: \"5286d0c8-0ce3-4d66-882f-9ffea6c90fa4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-8tlx5"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.707069 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-c7bxl"]
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.708370 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9n7tv"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.708569 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-c7bxl"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.710888 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-8tlx5"]
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.711247 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-klq94"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.714256 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"]
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.715441 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.717000 4786 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pcsz4"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.717839 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-c7bxl"]
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.724710 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"]
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.798890 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrq9\" (UniqueName: \"kubernetes.io/projected/ca1f4166-adf4-4281-925f-224930e8f775-kube-api-access-xwrq9\") pod \"cert-manager-webhook-5655c58dd6-9wd5r\" (UID: \"ca1f4166-adf4-4281-925f-224930e8f775\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.799106 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4sj4\" (UniqueName: \"kubernetes.io/projected/27ce8ca7-3393-43c6-ac0e-6e4128f84527-kube-api-access-p4sj4\") pod \"cert-manager-5b446d88c5-c7bxl\" (UID: \"27ce8ca7-3393-43c6-ac0e-6e4128f84527\") " pod="cert-manager/cert-manager-5b446d88c5-c7bxl"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.799218 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd5v5\" (UniqueName: \"kubernetes.io/projected/5286d0c8-0ce3-4d66-882f-9ffea6c90fa4-kube-api-access-rd5v5\") pod \"cert-manager-cainjector-7f985d654d-8tlx5\" (UID: \"5286d0c8-0ce3-4d66-882f-9ffea6c90fa4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-8tlx5"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.815711 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd5v5\" (UniqueName: \"kubernetes.io/projected/5286d0c8-0ce3-4d66-882f-9ffea6c90fa4-kube-api-access-rd5v5\") pod \"cert-manager-cainjector-7f985d654d-8tlx5\" (UID: \"5286d0c8-0ce3-4d66-882f-9ffea6c90fa4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-8tlx5"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.900348 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrq9\" (UniqueName: \"kubernetes.io/projected/ca1f4166-adf4-4281-925f-224930e8f775-kube-api-access-xwrq9\") pod \"cert-manager-webhook-5655c58dd6-9wd5r\" (UID: \"ca1f4166-adf4-4281-925f-224930e8f775\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.900417 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4sj4\" (UniqueName: \"kubernetes.io/projected/27ce8ca7-3393-43c6-ac0e-6e4128f84527-kube-api-access-p4sj4\") pod \"cert-manager-5b446d88c5-c7bxl\" (UID: \"27ce8ca7-3393-43c6-ac0e-6e4128f84527\") " pod="cert-manager/cert-manager-5b446d88c5-c7bxl"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.914605 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrq9\" (UniqueName: \"kubernetes.io/projected/ca1f4166-adf4-4281-925f-224930e8f775-kube-api-access-xwrq9\") pod \"cert-manager-webhook-5655c58dd6-9wd5r\" (UID: \"ca1f4166-adf4-4281-925f-224930e8f775\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"
Oct 02 06:54:30 crc kubenswrapper[4786]: I1002 06:54:30.915615 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4sj4\" (UniqueName: \"kubernetes.io/projected/27ce8ca7-3393-43c6-ac0e-6e4128f84527-kube-api-access-p4sj4\") pod \"cert-manager-5b446d88c5-c7bxl\" (UID: \"27ce8ca7-3393-43c6-ac0e-6e4128f84527\") " pod="cert-manager/cert-manager-5b446d88c5-c7bxl"
Oct 02 06:54:31 crc kubenswrapper[4786]: I1002 06:54:31.003895 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-8tlx5"
Oct 02 06:54:31 crc kubenswrapper[4786]: I1002 06:54:31.019216 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-c7bxl"
Oct 02 06:54:31 crc kubenswrapper[4786]: I1002 06:54:31.029861 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"
Oct 02 06:54:31 crc kubenswrapper[4786]: I1002 06:54:31.357042 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-8tlx5"]
Oct 02 06:54:31 crc kubenswrapper[4786]: I1002 06:54:31.367881 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 06:54:31 crc kubenswrapper[4786]: I1002 06:54:31.397537 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"]
Oct 02 06:54:31 crc kubenswrapper[4786]: I1002 06:54:31.398825 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-c7bxl"]
Oct 02 06:54:31 crc kubenswrapper[4786]: W1002 06:54:31.403954 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca1f4166_adf4_4281_925f_224930e8f775.slice/crio-31fad347946fb9cce631dc4cbf810f7555a85ab1b52cfe3127120ce790651447 WatchSource:0}: Error finding container 31fad347946fb9cce631dc4cbf810f7555a85ab1b52cfe3127120ce790651447: Status 404 returned error can't find the container with id 31fad347946fb9cce631dc4cbf810f7555a85ab1b52cfe3127120ce790651447
Oct 02 06:54:31 crc kubenswrapper[4786]: W1002 06:54:31.404805 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27ce8ca7_3393_43c6_ac0e_6e4128f84527.slice/crio-32f9307410c19fa70a7a446c238c5addfa031c6f542ffa1042149ed9f37090de WatchSource:0}: Error finding container 32f9307410c19fa70a7a446c238c5addfa031c6f542ffa1042149ed9f37090de: Status 404 returned error can't find the container with id 32f9307410c19fa70a7a446c238c5addfa031c6f542ffa1042149ed9f37090de
Oct 02 06:54:32 crc kubenswrapper[4786]: I1002 06:54:32.125792 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-c7bxl" event={"ID":"27ce8ca7-3393-43c6-ac0e-6e4128f84527","Type":"ContainerStarted","Data":"32f9307410c19fa70a7a446c238c5addfa031c6f542ffa1042149ed9f37090de"}
Oct 02 06:54:32 crc kubenswrapper[4786]: I1002 06:54:32.127188 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-8tlx5" event={"ID":"5286d0c8-0ce3-4d66-882f-9ffea6c90fa4","Type":"ContainerStarted","Data":"00f2049afde954eb6bcd0041a50fb749563fc8a55562e6264611f73822bc0d42"}
Oct 02 06:54:32 crc kubenswrapper[4786]: I1002 06:54:32.128068 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r" event={"ID":"ca1f4166-adf4-4281-925f-224930e8f775","Type":"ContainerStarted","Data":"31fad347946fb9cce631dc4cbf810f7555a85ab1b52cfe3127120ce790651447"}
Oct 02 06:54:34 crc kubenswrapper[4786]: I1002 06:54:34.150137 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-c7bxl" event={"ID":"27ce8ca7-3393-43c6-ac0e-6e4128f84527","Type":"ContainerStarted","Data":"0ee420728bb166fde3cf85239d39ab1a81d23cd7763d658099bdd8d36e470893"}
Oct 02 06:54:34 crc kubenswrapper[4786]: I1002 06:54:34.151929 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-8tlx5" event={"ID":"5286d0c8-0ce3-4d66-882f-9ffea6c90fa4","Type":"ContainerStarted","Data":"a29759b5155cfd76d55bf0b80ba0c16c3d48e33c4d838c63e43b3f92200bb67d"}
Oct 02 06:54:34 crc kubenswrapper[4786]: I1002 06:54:34.164471 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-c7bxl" podStartSLOduration=2.034455416 podStartE2EDuration="4.164457296s" podCreationTimestamp="2025-10-02 06:54:30 +0000 UTC" firstStartedPulling="2025-10-02 06:54:31.406441391 +0000 UTC m=+481.527624522" lastFinishedPulling="2025-10-02 06:54:33.536443271 +0000 UTC m=+483.657626402" observedRunningTime="2025-10-02 06:54:34.160856575 +0000 UTC m=+484.282039716" watchObservedRunningTime="2025-10-02 06:54:34.164457296 +0000 UTC m=+484.285640447"
Oct 02 06:54:34 crc kubenswrapper[4786]: I1002 06:54:34.172243 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-8tlx5" podStartSLOduration=2.028614113 podStartE2EDuration="4.172229456s" podCreationTimestamp="2025-10-02 06:54:30 +0000 UTC" firstStartedPulling="2025-10-02 06:54:31.367630676 +0000 UTC m=+481.488813807" lastFinishedPulling="2025-10-02 06:54:33.511246019 +0000 UTC m=+483.632429150" observedRunningTime="2025-10-02 06:54:34.171508946 +0000 UTC m=+484.292692087" watchObservedRunningTime="2025-10-02 06:54:34.172229456 +0000 UTC m=+484.293412587"
Oct 02 06:54:35 crc kubenswrapper[4786]: I1002 06:54:35.158598 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r" event={"ID":"ca1f4166-adf4-4281-925f-224930e8f775","Type":"ContainerStarted","Data":"4470136dd04d303a732230e9591b03ac4778270b802195aaca569d6aa3ad5318"}
Oct 02 06:54:35 crc kubenswrapper[4786]: I1002 06:54:35.168497 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r" podStartSLOduration=2.183929779 podStartE2EDuration="5.168488242s" podCreationTimestamp="2025-10-02 06:54:30 +0000 UTC" firstStartedPulling="2025-10-02 06:54:31.40596929 +0000 UTC m=+481.527152421" lastFinishedPulling="2025-10-02 06:54:34.390527752 +0000 UTC m=+484.511710884" observedRunningTime="2025-10-02 06:54:35.167456143 +0000 UTC m=+485.288639274" watchObservedRunningTime="2025-10-02 06:54:35.168488242 +0000 UTC m=+485.289671373"
Oct 02 06:54:36 crc kubenswrapper[4786]: I1002 06:54:36.031001 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r"
Oct 02 06:54:39 crc kubenswrapper[4786]: I1002
06:54:39.049204 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" podUID="f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" containerName="registry" containerID="cri-o://732f58723a5dbe464dbb8d7754ca3725e70115056af00447e118d68170979533" gracePeriod=30 Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.175012 4786 generic.go:334] "Generic (PLEG): container finished" podID="f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" containerID="732f58723a5dbe464dbb8d7754ca3725e70115056af00447e118d68170979533" exitCode=0 Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.175048 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" event={"ID":"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e","Type":"ContainerDied","Data":"732f58723a5dbe464dbb8d7754ca3725e70115056af00447e118d68170979533"} Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.349842 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.391510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-trusted-ca\") pod \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.391734 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.391772 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4snl9\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-kube-api-access-4snl9\") pod \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.392023 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-installation-pull-secrets\") pod \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.392050 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-tls\") pod \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.392108 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-certificates\") pod \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.392136 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-bound-sa-token\") pod \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.392153 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-ca-trust-extracted\") pod \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\" (UID: \"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e\") " Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.392272 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.393235 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.397468 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.397560 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-kube-api-access-4snl9" (OuterVolumeSpecName: "kube-api-access-4snl9") pod "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e"). InnerVolumeSpecName "kube-api-access-4snl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.397739 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.398235 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.399031 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.405397 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" (UID: "f37c1514-a1f1-44b3-949d-51d1b5d4ae6e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.493500 4786 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.493638 4786 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.493720 4786 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.493771 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.493850 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4snl9\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-kube-api-access-4snl9\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.493900 4786 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:39 crc kubenswrapper[4786]: I1002 06:54:39.493951 4786 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:40 crc kubenswrapper[4786]: I1002 06:54:40.181806 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" Oct 02 06:54:40 crc kubenswrapper[4786]: I1002 06:54:40.184080 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-66zdg" event={"ID":"f37c1514-a1f1-44b3-949d-51d1b5d4ae6e","Type":"ContainerDied","Data":"ae13da93cb97fbca5e20562e2557ba48aa2cea739c07578efbad754a7a235e58"} Oct 02 06:54:40 crc kubenswrapper[4786]: I1002 06:54:40.184127 4786 scope.go:117] "RemoveContainer" containerID="732f58723a5dbe464dbb8d7754ca3725e70115056af00447e118d68170979533" Oct 02 06:54:40 crc kubenswrapper[4786]: I1002 06:54:40.214377 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66zdg"] Oct 02 06:54:40 crc kubenswrapper[4786]: I1002 06:54:40.217294 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-66zdg"] Oct 02 06:54:41 crc kubenswrapper[4786]: I1002 06:54:41.033062 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-9wd5r" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.190739 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" path="/var/lib/kubelet/pods/f37c1514-a1f1-44b3-949d-51d1b5d4ae6e/volumes" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.243940 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bgs8z"] Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.244254 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovn-controller" containerID="cri-o://696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01" gracePeriod=30 Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 
06:54:42.244379 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="northd" containerID="cri-o://c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc" gracePeriod=30 Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.244449 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8" gracePeriod=30 Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.244449 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kube-rbac-proxy-node" containerID="cri-o://0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872" gracePeriod=30 Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.244443 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovn-acl-logging" containerID="cri-o://3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f" gracePeriod=30 Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.244485 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="nbdb" containerID="cri-o://ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8" gracePeriod=30 Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.244487 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" 
podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="sbdb" containerID="cri-o://b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256" gracePeriod=30 Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.294202 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" containerID="cri-o://c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80" gracePeriod=30 Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.541103 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/3.log" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.543146 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovn-acl-logging/0.log" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.543576 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovn-controller/0.log" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.544006 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.583904 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hqx6k"] Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584110 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kubecfg-setup" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584127 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kubecfg-setup" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584139 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584145 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584152 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584157 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584166 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovn-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584171 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovn-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584181 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" containerName="registry" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584186 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" containerName="registry" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584192 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovn-acl-logging" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584197 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovn-acl-logging" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584202 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kube-rbac-proxy-node" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584207 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kube-rbac-proxy-node" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584214 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="nbdb" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584220 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="nbdb" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584226 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584231 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584240 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="northd" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584246 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="northd" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584251 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="sbdb" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584256 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="sbdb" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584265 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584272 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584354 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kube-rbac-proxy-node" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584363 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584369 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovn-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584375 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584382 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="northd" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584388 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="nbdb" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584393 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584401 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584406 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovn-acl-logging" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584411 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f37c1514-a1f1-44b3-949d-51d1b5d4ae6e" containerName="registry" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584419 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="sbdb" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584487 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584493 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: E1002 06:54:42.584506 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584512 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584596 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.584606 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="894eab78-90cf-4975-aa45-223332e04f5c" containerName="ovnkube-controller" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.585870 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.619721 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.619779 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-systemd\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.619775 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.619848 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-netns\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.619874 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.619927 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-kubelet\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.619949 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-netd\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620012 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-slash\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620032 4786 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620043 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-script-lib\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620065 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620068 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-slash" (OuterVolumeSpecName: "host-slash") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620188 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-env-overrides\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620233 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-systemd-units\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620249 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-cni-bin\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620279 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-run-netns\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620292 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-cni-netd\") pod \"ovnkube-node-hqx6k\" (UID: 
\"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620310 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-systemd\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620348 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovnkube-config\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620565 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620608 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-openvswitch\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620668 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-node-log\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-var-lib-openvswitch\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620776 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-ovn\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620806 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-log-socket\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620872 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-slash\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620925 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovnkube-script-lib\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.620981 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-etc-openvswitch\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621005 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-kubelet\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621049 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-hqx6k\" (UID: 
\"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621111 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621153 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqzlp\" (UniqueName: \"kubernetes.io/projected/086df1cb-8c1e-4f8f-9895-a98b20f151bb-kube-api-access-dqzlp\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621173 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovn-node-metrics-cert\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621234 4786 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621249 4786 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc 
kubenswrapper[4786]: I1002 06:54:42.621259 4786 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621268 4786 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621279 4786 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-slash\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.621289 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.632275 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721443 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-systemd-units\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721474 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-ovn-kubernetes\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721496 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsb84\" (UniqueName: \"kubernetes.io/projected/894eab78-90cf-4975-aa45-223332e04f5c-kube-api-access-nsb84\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721514 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/894eab78-90cf-4975-aa45-223332e04f5c-ovn-node-metrics-cert\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721537 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-bin\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721543 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721590 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721598 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721626 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-node-log\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721641 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-var-lib-openvswitch\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721660 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-env-overrides\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721704 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-node-log" (OuterVolumeSpecName: "node-log") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721716 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721732 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-etc-openvswitch\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721751 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-openvswitch\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721787 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721808 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-config\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721826 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-log-socket\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721866 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.721997 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722023 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-log-socket" (OuterVolumeSpecName: "log-socket") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). 
InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722166 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722212 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-ovn\") pod \"894eab78-90cf-4975-aa45-223332e04f5c\" (UID: \"894eab78-90cf-4975-aa45-223332e04f5c\") " Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722278 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722430 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-env-overrides\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-systemd-units\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722591 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-systemd-units\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722914 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-env-overrides\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722962 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-cni-bin\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 
06:54:42.722983 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-cni-bin\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.722986 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-run-netns\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723020 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-cni-netd\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723026 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-run-netns\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723043 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-systemd\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723051 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-cni-netd\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723071 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-systemd\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723094 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovnkube-config\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723119 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-openvswitch\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-node-log\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723158 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-var-lib-openvswitch\") 
pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723191 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-openvswitch\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723206 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-node-log\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723232 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-ovn\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723250 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-log-socket\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723286 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-run-ovn\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc 
kubenswrapper[4786]: I1002 06:54:42.723292 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-log-socket\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723313 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-var-lib-openvswitch\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723329 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-slash\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723345 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovnkube-script-lib\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723367 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-etc-openvswitch\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723382 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-slash\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723630 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovnkube-config\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723833 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovnkube-script-lib\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723899 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-etc-openvswitch\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723953 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-kubelet\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.723972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724016 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-kubelet\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724051 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724108 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724106 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/086df1cb-8c1e-4f8f-9895-a98b20f151bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724075 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dqzlp\" (UniqueName: \"kubernetes.io/projected/086df1cb-8c1e-4f8f-9895-a98b20f151bb-kube-api-access-dqzlp\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724153 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovn-node-metrics-cert\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724203 4786 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724212 4786 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-node-log\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724220 4786 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724227 4786 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724234 4786 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 
crc kubenswrapper[4786]: I1002 06:54:42.724658 4786 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724675 4786 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/894eab78-90cf-4975-aa45-223332e04f5c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724684 4786 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-log-socket\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724706 4786 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724714 4786 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724721 4786 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.724729 4786 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/894eab78-90cf-4975-aa45-223332e04f5c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.725141 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/894eab78-90cf-4975-aa45-223332e04f5c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.725471 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/894eab78-90cf-4975-aa45-223332e04f5c-kube-api-access-nsb84" (OuterVolumeSpecName: "kube-api-access-nsb84") pod "894eab78-90cf-4975-aa45-223332e04f5c" (UID: "894eab78-90cf-4975-aa45-223332e04f5c"). InnerVolumeSpecName "kube-api-access-nsb84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.727471 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/086df1cb-8c1e-4f8f-9895-a98b20f151bb-ovn-node-metrics-cert\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.737016 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqzlp\" (UniqueName: \"kubernetes.io/projected/086df1cb-8c1e-4f8f-9895-a98b20f151bb-kube-api-access-dqzlp\") pod \"ovnkube-node-hqx6k\" (UID: \"086df1cb-8c1e-4f8f-9895-a98b20f151bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.825591 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsb84\" (UniqueName: \"kubernetes.io/projected/894eab78-90cf-4975-aa45-223332e04f5c-kube-api-access-nsb84\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.825915 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/894eab78-90cf-4975-aa45-223332e04f5c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:54:42 crc kubenswrapper[4786]: I1002 06:54:42.902295 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.201329 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/2.log" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.202958 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/1.log" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.203003 4786 generic.go:334] "Generic (PLEG): container finished" podID="de8dcd53-84d9-422e-8f18-63ea8ea75bd2" containerID="07b92f0dcfb6f4e4c5ff46665fe3986acd98cd8a36fe13f123938e8010eb7928" exitCode=2 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.203086 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7hgkl" event={"ID":"de8dcd53-84d9-422e-8f18-63ea8ea75bd2","Type":"ContainerDied","Data":"07b92f0dcfb6f4e4c5ff46665fe3986acd98cd8a36fe13f123938e8010eb7928"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.203128 4786 scope.go:117] "RemoveContainer" containerID="8b5248bee859340a1496284f12f5f40320ec7bd7f0d299f47b29eed26204d6fc" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.203661 4786 scope.go:117] "RemoveContainer" containerID="07b92f0dcfb6f4e4c5ff46665fe3986acd98cd8a36fe13f123938e8010eb7928" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.203920 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7hgkl_openshift-multus(de8dcd53-84d9-422e-8f18-63ea8ea75bd2)\"" 
pod="openshift-multus/multus-7hgkl" podUID="de8dcd53-84d9-422e-8f18-63ea8ea75bd2" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.208784 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovnkube-controller/3.log" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.210377 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovn-acl-logging/0.log" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.210787 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bgs8z_894eab78-90cf-4975-aa45-223332e04f5c/ovn-controller/0.log" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211039 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80" exitCode=0 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211059 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256" exitCode=0 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211068 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8" exitCode=0 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211076 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc" exitCode=0 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211082 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" 
containerID="d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8" exitCode=0 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211088 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872" exitCode=0 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211093 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f" exitCode=143 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211098 4786 generic.go:334] "Generic (PLEG): container finished" podID="894eab78-90cf-4975-aa45-223332e04f5c" containerID="696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01" exitCode=143 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211142 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211167 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211178 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211187 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" 
event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211195 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211203 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211213 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211222 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211227 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211232 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211237 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211241 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211246 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211326 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211332 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211351 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211359 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211367 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211373 4786 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211378 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211382 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211387 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211390 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211393 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211484 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211491 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211496 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211501 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211508 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211518 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211524 4786 pod_container_deletor.go:114] "Failed to 
issue the request to remove container" containerID={"Type":"cri-o","ID":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211530 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211535 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211549 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211564 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211569 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211573 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211578 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211582 4786 pod_container_deletor.go:114] "Failed to issue 
the request to remove container" containerID={"Type":"cri-o","ID":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211589 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bgs8z" event={"ID":"894eab78-90cf-4975-aa45-223332e04f5c","Type":"ContainerDied","Data":"d5f32b54be0b5c8f03f781c6e922a4e9c3f5a50647be38cb9f7e6ae3c98cd11a"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211596 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211602 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211606 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211611 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211616 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211620 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 
06:54:43.211624 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211629 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211634 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.211639 4786 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.212434 4786 generic.go:334] "Generic (PLEG): container finished" podID="086df1cb-8c1e-4f8f-9895-a98b20f151bb" containerID="4ae71e62ec3fc3f586be5e2a9872583ca6f2287e88bfd733d137d3c4af889c63" exitCode=0 Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.212457 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerDied","Data":"4ae71e62ec3fc3f586be5e2a9872583ca6f2287e88bfd733d137d3c4af889c63"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.212470 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"bcd102a53361b13ffd7ee977972b2e23209ecad7870e936aab3f458378f88379"} Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.237405 4786 scope.go:117] "RemoveContainer" 
containerID="c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.254659 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.262864 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bgs8z"] Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.266079 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bgs8z"] Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.296573 4786 scope.go:117] "RemoveContainer" containerID="b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.309500 4786 scope.go:117] "RemoveContainer" containerID="ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.320009 4786 scope.go:117] "RemoveContainer" containerID="c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.329467 4786 scope.go:117] "RemoveContainer" containerID="d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.344386 4786 scope.go:117] "RemoveContainer" containerID="0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.354360 4786 scope.go:117] "RemoveContainer" containerID="3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.367326 4786 scope.go:117] "RemoveContainer" containerID="696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.393258 4786 scope.go:117] "RemoveContainer" 
containerID="0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.407895 4786 scope.go:117] "RemoveContainer" containerID="c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.408217 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": container with ID starting with c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80 not found: ID does not exist" containerID="c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.408259 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} err="failed to get container status \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": rpc error: code = NotFound desc = could not find container \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": container with ID starting with c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.408285 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.408516 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": container with ID starting with 7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07 not found: ID does not exist" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:54:43 crc 
kubenswrapper[4786]: I1002 06:54:43.408543 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} err="failed to get container status \"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": rpc error: code = NotFound desc = could not find container \"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": container with ID starting with 7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.408574 4786 scope.go:117] "RemoveContainer" containerID="b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.408834 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": container with ID starting with b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256 not found: ID does not exist" containerID="b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.408872 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} err="failed to get container status \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": rpc error: code = NotFound desc = could not find container \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": container with ID starting with b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.408893 4786 scope.go:117] "RemoveContainer" containerID="ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8" Oct 02 
06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.409144 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": container with ID starting with ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8 not found: ID does not exist" containerID="ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.409172 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} err="failed to get container status \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": rpc error: code = NotFound desc = could not find container \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": container with ID starting with ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.409188 4786 scope.go:117] "RemoveContainer" containerID="c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.409393 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": container with ID starting with c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc not found: ID does not exist" containerID="c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.409425 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} err="failed to get container status 
\"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": rpc error: code = NotFound desc = could not find container \"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": container with ID starting with c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.409441 4786 scope.go:117] "RemoveContainer" containerID="d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.409654 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": container with ID starting with d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8 not found: ID does not exist" containerID="d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.409676 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} err="failed to get container status \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": rpc error: code = NotFound desc = could not find container \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": container with ID starting with d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.409709 4786 scope.go:117] "RemoveContainer" containerID="0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.409885 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": container with ID starting with 0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872 not found: ID does not exist" containerID="0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.409902 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} err="failed to get container status \"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": rpc error: code = NotFound desc = could not find container \"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": container with ID starting with 0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.409914 4786 scope.go:117] "RemoveContainer" containerID="3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.410069 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": container with ID starting with 3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f not found: ID does not exist" containerID="3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410082 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} err="failed to get container status \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": rpc error: code = NotFound desc = could not find container \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": container with ID 
starting with 3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410092 4786 scope.go:117] "RemoveContainer" containerID="696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.410238 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": container with ID starting with 696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01 not found: ID does not exist" containerID="696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410256 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} err="failed to get container status \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": rpc error: code = NotFound desc = could not find container \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": container with ID starting with 696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410268 4786 scope.go:117] "RemoveContainer" containerID="0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a" Oct 02 06:54:43 crc kubenswrapper[4786]: E1002 06:54:43.410415 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": container with ID starting with 0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a not found: ID does not exist" containerID="0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a" Oct 02 
06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410432 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} err="failed to get container status \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": rpc error: code = NotFound desc = could not find container \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": container with ID starting with 0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410443 4786 scope.go:117] "RemoveContainer" containerID="c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410594 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} err="failed to get container status \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": rpc error: code = NotFound desc = could not find container \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": container with ID starting with c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410610 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410777 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} err="failed to get container status \"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": rpc error: code = NotFound desc = could not find container 
\"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": container with ID starting with 7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410793 4786 scope.go:117] "RemoveContainer" containerID="b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410949 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} err="failed to get container status \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": rpc error: code = NotFound desc = could not find container \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": container with ID starting with b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.410965 4786 scope.go:117] "RemoveContainer" containerID="ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411124 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} err="failed to get container status \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": rpc error: code = NotFound desc = could not find container \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": container with ID starting with ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411142 4786 scope.go:117] "RemoveContainer" containerID="c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411285 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} err="failed to get container status \"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": rpc error: code = NotFound desc = could not find container \"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": container with ID starting with c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411300 4786 scope.go:117] "RemoveContainer" containerID="d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411519 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} err="failed to get container status \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": rpc error: code = NotFound desc = could not find container \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": container with ID starting with d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411539 4786 scope.go:117] "RemoveContainer" containerID="0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411793 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} err="failed to get container status \"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": rpc error: code = NotFound desc = could not find container \"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": container with ID starting with 
0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411812 4786 scope.go:117] "RemoveContainer" containerID="3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411964 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} err="failed to get container status \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": rpc error: code = NotFound desc = could not find container \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": container with ID starting with 3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.411980 4786 scope.go:117] "RemoveContainer" containerID="696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412121 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} err="failed to get container status \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": rpc error: code = NotFound desc = could not find container \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": container with ID starting with 696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412134 4786 scope.go:117] "RemoveContainer" containerID="0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412274 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} err="failed to get container status \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": rpc error: code = NotFound desc = could not find container \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": container with ID starting with 0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412286 4786 scope.go:117] "RemoveContainer" containerID="c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412431 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} err="failed to get container status \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": rpc error: code = NotFound desc = could not find container \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": container with ID starting with c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412442 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412596 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} err="failed to get container status \"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": rpc error: code = NotFound desc = could not find container \"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": container with ID starting with 7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07 not found: ID does not 
exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412608 4786 scope.go:117] "RemoveContainer" containerID="b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412767 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} err="failed to get container status \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": rpc error: code = NotFound desc = could not find container \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": container with ID starting with b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412778 4786 scope.go:117] "RemoveContainer" containerID="ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412918 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} err="failed to get container status \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": rpc error: code = NotFound desc = could not find container \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": container with ID starting with ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.412930 4786 scope.go:117] "RemoveContainer" containerID="c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413072 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} err="failed to get container status 
\"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": rpc error: code = NotFound desc = could not find container \"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": container with ID starting with c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413083 4786 scope.go:117] "RemoveContainer" containerID="d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413224 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} err="failed to get container status \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": rpc error: code = NotFound desc = could not find container \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": container with ID starting with d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413239 4786 scope.go:117] "RemoveContainer" containerID="0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413392 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} err="failed to get container status \"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": rpc error: code = NotFound desc = could not find container \"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": container with ID starting with 0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413408 4786 scope.go:117] "RemoveContainer" 
containerID="3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413574 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} err="failed to get container status \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": rpc error: code = NotFound desc = could not find container \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": container with ID starting with 3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413591 4786 scope.go:117] "RemoveContainer" containerID="696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413761 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} err="failed to get container status \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": rpc error: code = NotFound desc = could not find container \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": container with ID starting with 696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413778 4786 scope.go:117] "RemoveContainer" containerID="0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413918 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} err="failed to get container status \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": rpc error: code = NotFound desc = could 
not find container \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": container with ID starting with 0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.413938 4786 scope.go:117] "RemoveContainer" containerID="c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414077 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80"} err="failed to get container status \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": rpc error: code = NotFound desc = could not find container \"c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80\": container with ID starting with c5eb670f91d4faf4a640956c5fc0e6d165c16e7d97c6308139c6ac414f322e80 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414089 4786 scope.go:117] "RemoveContainer" containerID="7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414231 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07"} err="failed to get container status \"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": rpc error: code = NotFound desc = could not find container \"7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07\": container with ID starting with 7c8dacc63e6520fb434e125ec99ef14fccb93d61332a9fa48626470dc6cf1a07 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414244 4786 scope.go:117] "RemoveContainer" containerID="b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 
06:54:43.414380 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256"} err="failed to get container status \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": rpc error: code = NotFound desc = could not find container \"b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256\": container with ID starting with b5cdf365347f435e6260f52e302e3e4d701a095e95572cb2095eae7dc7bd0256 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414393 4786 scope.go:117] "RemoveContainer" containerID="ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414538 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8"} err="failed to get container status \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": rpc error: code = NotFound desc = could not find container \"ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8\": container with ID starting with ae89c8afb504acb78be4f04711dc15161e38d2e1f9c3ebb6091333a5a5fb65e8 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414551 4786 scope.go:117] "RemoveContainer" containerID="c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414719 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc"} err="failed to get container status \"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": rpc error: code = NotFound desc = could not find container \"c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc\": container with ID starting with 
c6d93165a686696d81e8563c2a599a1c10cb82f2f4b003c7d4079659890189bc not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414731 4786 scope.go:117] "RemoveContainer" containerID="d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414880 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8"} err="failed to get container status \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": rpc error: code = NotFound desc = could not find container \"d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8\": container with ID starting with d70d6fe80e2146e6d77b41ae60791bfc2d0f5c7170516a14a789c389eafc7cb8 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.414895 4786 scope.go:117] "RemoveContainer" containerID="0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.415046 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872"} err="failed to get container status \"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": rpc error: code = NotFound desc = could not find container \"0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872\": container with ID starting with 0e2e9a3f46851bcc031f3e32ed0be65bb6bc9e436042c2344fa08ae840725872 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.415062 4786 scope.go:117] "RemoveContainer" containerID="3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.415203 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f"} err="failed to get container status \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": rpc error: code = NotFound desc = could not find container \"3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f\": container with ID starting with 3d6519ccfd5001250848af2a1412b22ab17ccd8035350ead64bddc21d6cdac0f not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.415218 4786 scope.go:117] "RemoveContainer" containerID="696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.415359 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01"} err="failed to get container status \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": rpc error: code = NotFound desc = could not find container \"696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01\": container with ID starting with 696d77918a66206ac50cfdf86437f3701fa50b5aff79c6874229a963cd354d01 not found: ID does not exist" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.415374 4786 scope.go:117] "RemoveContainer" containerID="0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a" Oct 02 06:54:43 crc kubenswrapper[4786]: I1002 06:54:43.415512 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a"} err="failed to get container status \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": rpc error: code = NotFound desc = could not find container \"0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a\": container with ID starting with 0348cd3e070b1b001a4bfe8353baf94ccd7dc21964464c14ac8ff4148e44571a not found: ID does not 
exist" Oct 02 06:54:44 crc kubenswrapper[4786]: I1002 06:54:44.185617 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="894eab78-90cf-4975-aa45-223332e04f5c" path="/var/lib/kubelet/pods/894eab78-90cf-4975-aa45-223332e04f5c/volumes" Oct 02 06:54:44 crc kubenswrapper[4786]: I1002 06:54:44.219205 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/2.log" Oct 02 06:54:44 crc kubenswrapper[4786]: I1002 06:54:44.223011 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"2d36f5c4cf72fc70cbf0bc76404d4a73c823a377f202c0d70bde4def4328f180"} Oct 02 06:54:44 crc kubenswrapper[4786]: I1002 06:54:44.223089 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"ee6a9471c67b41c0f78fd36f92473291e9740657573447deb3d8908d409f5e62"} Oct 02 06:54:44 crc kubenswrapper[4786]: I1002 06:54:44.223106 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"bab6347c3a69b64748c978e61bec24281a2e5bc245d736f9573a3e346dcc9f74"} Oct 02 06:54:44 crc kubenswrapper[4786]: I1002 06:54:44.223118 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"55f4e5d82f1ebcd0876ced2c81de5a913f88ce318982e9087e7899bb15bdd8ae"} Oct 02 06:54:44 crc kubenswrapper[4786]: I1002 06:54:44.223130 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" 
event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"e8700a34df55279afc3f0351fd009830b2c75cdb0ad55b44e5cbab63db0df535"} Oct 02 06:54:44 crc kubenswrapper[4786]: I1002 06:54:44.223141 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"b581aa139af69643e3b4dda394b1dbd15038f5949204f18f4d3dc663d3951d17"} Oct 02 06:54:46 crc kubenswrapper[4786]: I1002 06:54:46.235189 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"92871227fa1249f1b250494afbdd51e0761d641c77be8bf22cf4b64f0db244e1"} Oct 02 06:54:48 crc kubenswrapper[4786]: I1002 06:54:48.247002 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" event={"ID":"086df1cb-8c1e-4f8f-9895-a98b20f151bb","Type":"ContainerStarted","Data":"7000f3b42ca0b30bb1ae3ad4340fc304f10c30cd3cfee0a96134d137341c03b4"} Oct 02 06:54:48 crc kubenswrapper[4786]: I1002 06:54:48.247636 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:48 crc kubenswrapper[4786]: I1002 06:54:48.247651 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:48 crc kubenswrapper[4786]: I1002 06:54:48.275013 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:48 crc kubenswrapper[4786]: I1002 06:54:48.278296 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" podStartSLOduration=6.278282474 podStartE2EDuration="6.278282474s" podCreationTimestamp="2025-10-02 06:54:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:54:48.275149666 +0000 UTC m=+498.396332818" watchObservedRunningTime="2025-10-02 06:54:48.278282474 +0000 UTC m=+498.399465605" Oct 02 06:54:49 crc kubenswrapper[4786]: I1002 06:54:49.252487 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:49 crc kubenswrapper[4786]: I1002 06:54:49.273130 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:54:57 crc kubenswrapper[4786]: I1002 06:54:57.179372 4786 scope.go:117] "RemoveContainer" containerID="07b92f0dcfb6f4e4c5ff46665fe3986acd98cd8a36fe13f123938e8010eb7928" Oct 02 06:54:57 crc kubenswrapper[4786]: E1002 06:54:57.180154 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7hgkl_openshift-multus(de8dcd53-84d9-422e-8f18-63ea8ea75bd2)\"" pod="openshift-multus/multus-7hgkl" podUID="de8dcd53-84d9-422e-8f18-63ea8ea75bd2" Oct 02 06:54:57 crc kubenswrapper[4786]: I1002 06:54:57.497975 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 06:54:57 crc kubenswrapper[4786]: I1002 06:54:57.498083 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 06:55:08 crc 
kubenswrapper[4786]: I1002 06:55:08.681380 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c"] Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.682803 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.686053 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.693039 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c"] Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.721051 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.721094 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.721170 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9zbr\" (UniqueName: 
\"kubernetes.io/projected/5490b488-b520-4906-92b6-b13a997075fb-kube-api-access-p9zbr\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.822747 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.822795 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.822868 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9zbr\" (UniqueName: \"kubernetes.io/projected/5490b488-b520-4906-92b6-b13a997075fb-kube-api-access-p9zbr\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.823338 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: 
\"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.823514 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.842516 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9zbr\" (UniqueName: \"kubernetes.io/projected/5490b488-b520-4906-92b6-b13a997075fb-kube-api-access-p9zbr\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:08 crc kubenswrapper[4786]: I1002 06:55:08.996685 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:09 crc kubenswrapper[4786]: E1002 06:55:09.022629 4786 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace_5490b488-b520-4906-92b6-b13a997075fb_0(23c5591fcc78a4ea7b0e39b2c03332a27bc79750184e63c115733035b7820e30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 02 06:55:09 crc kubenswrapper[4786]: E1002 06:55:09.022769 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace_5490b488-b520-4906-92b6-b13a997075fb_0(23c5591fcc78a4ea7b0e39b2c03332a27bc79750184e63c115733035b7820e30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:09 crc kubenswrapper[4786]: E1002 06:55:09.022847 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace_5490b488-b520-4906-92b6-b13a997075fb_0(23c5591fcc78a4ea7b0e39b2c03332a27bc79750184e63c115733035b7820e30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:09 crc kubenswrapper[4786]: E1002 06:55:09.022948 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace(5490b488-b520-4906-92b6-b13a997075fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace(5490b488-b520-4906-92b6-b13a997075fb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace_5490b488-b520-4906-92b6-b13a997075fb_0(23c5591fcc78a4ea7b0e39b2c03332a27bc79750184e63c115733035b7820e30): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" podUID="5490b488-b520-4906-92b6-b13a997075fb" Oct 02 06:55:09 crc kubenswrapper[4786]: I1002 06:55:09.348851 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:09 crc kubenswrapper[4786]: I1002 06:55:09.349263 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:09 crc kubenswrapper[4786]: E1002 06:55:09.377380 4786 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace_5490b488-b520-4906-92b6-b13a997075fb_0(fcbc47092a8e1683517c23879775840e9a6f1879e8fcc412a3e8f9c396bd3ba5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 06:55:09 crc kubenswrapper[4786]: E1002 06:55:09.377452 4786 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace_5490b488-b520-4906-92b6-b13a997075fb_0(fcbc47092a8e1683517c23879775840e9a6f1879e8fcc412a3e8f9c396bd3ba5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:09 crc kubenswrapper[4786]: E1002 06:55:09.377485 4786 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace_5490b488-b520-4906-92b6-b13a997075fb_0(fcbc47092a8e1683517c23879775840e9a6f1879e8fcc412a3e8f9c396bd3ba5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:09 crc kubenswrapper[4786]: E1002 06:55:09.377555 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace(5490b488-b520-4906-92b6-b13a997075fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace(5490b488-b520-4906-92b6-b13a997075fb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_openshift-marketplace_5490b488-b520-4906-92b6-b13a997075fb_0(fcbc47092a8e1683517c23879775840e9a6f1879e8fcc412a3e8f9c396bd3ba5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" podUID="5490b488-b520-4906-92b6-b13a997075fb" Oct 02 06:55:11 crc kubenswrapper[4786]: I1002 06:55:11.179633 4786 scope.go:117] "RemoveContainer" containerID="07b92f0dcfb6f4e4c5ff46665fe3986acd98cd8a36fe13f123938e8010eb7928" Oct 02 06:55:11 crc kubenswrapper[4786]: I1002 06:55:11.362283 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7hgkl_de8dcd53-84d9-422e-8f18-63ea8ea75bd2/kube-multus/2.log" Oct 02 06:55:11 crc kubenswrapper[4786]: I1002 06:55:11.362358 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7hgkl" event={"ID":"de8dcd53-84d9-422e-8f18-63ea8ea75bd2","Type":"ContainerStarted","Data":"5129b48cc23a7cca99df53b1a29aeeb2197ed247081cf7ea0e8716b4d462c07a"} Oct 02 06:55:12 crc kubenswrapper[4786]: I1002 06:55:12.918832 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqx6k" Oct 02 06:55:23 crc kubenswrapper[4786]: I1002 06:55:23.179174 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:23 crc kubenswrapper[4786]: I1002 06:55:23.180505 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:23 crc kubenswrapper[4786]: I1002 06:55:23.525288 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c"] Oct 02 06:55:24 crc kubenswrapper[4786]: I1002 06:55:24.421763 4786 generic.go:334] "Generic (PLEG): container finished" podID="5490b488-b520-4906-92b6-b13a997075fb" containerID="3edcb333663c302a9ab1587c50d1f5478415952bf618b12f7ba680db0a89158e" exitCode=0 Oct 02 06:55:24 crc kubenswrapper[4786]: I1002 06:55:24.421862 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" event={"ID":"5490b488-b520-4906-92b6-b13a997075fb","Type":"ContainerDied","Data":"3edcb333663c302a9ab1587c50d1f5478415952bf618b12f7ba680db0a89158e"} Oct 02 06:55:24 crc kubenswrapper[4786]: I1002 06:55:24.422116 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" event={"ID":"5490b488-b520-4906-92b6-b13a997075fb","Type":"ContainerStarted","Data":"be73d6802be0dad911d5a6a4109adfb8c90923ff96391c601762f5d7b494f9a5"} Oct 02 06:55:26 crc kubenswrapper[4786]: I1002 06:55:26.432438 4786 generic.go:334] "Generic (PLEG): container finished" podID="5490b488-b520-4906-92b6-b13a997075fb" containerID="a0c15899324d1d003b0b644fa55881480fbdb48540376bdbe3ca70ae97cada4c" exitCode=0 Oct 02 06:55:26 crc kubenswrapper[4786]: I1002 06:55:26.432533 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" event={"ID":"5490b488-b520-4906-92b6-b13a997075fb","Type":"ContainerDied","Data":"a0c15899324d1d003b0b644fa55881480fbdb48540376bdbe3ca70ae97cada4c"} Oct 02 06:55:27 crc kubenswrapper[4786]: I1002 06:55:27.438885 4786 
generic.go:334] "Generic (PLEG): container finished" podID="5490b488-b520-4906-92b6-b13a997075fb" containerID="d6b63fe1acd3834d60fe9feb93bca5237e796dd50f1029be2a9b2033927279de" exitCode=0 Oct 02 06:55:27 crc kubenswrapper[4786]: I1002 06:55:27.438933 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" event={"ID":"5490b488-b520-4906-92b6-b13a997075fb","Type":"ContainerDied","Data":"d6b63fe1acd3834d60fe9feb93bca5237e796dd50f1029be2a9b2033927279de"} Oct 02 06:55:27 crc kubenswrapper[4786]: I1002 06:55:27.497295 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 06:55:27 crc kubenswrapper[4786]: I1002 06:55:27.497354 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.612481 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.627782 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-util\") pod \"5490b488-b520-4906-92b6-b13a997075fb\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.627865 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9zbr\" (UniqueName: \"kubernetes.io/projected/5490b488-b520-4906-92b6-b13a997075fb-kube-api-access-p9zbr\") pod \"5490b488-b520-4906-92b6-b13a997075fb\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.633159 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5490b488-b520-4906-92b6-b13a997075fb-kube-api-access-p9zbr" (OuterVolumeSpecName: "kube-api-access-p9zbr") pod "5490b488-b520-4906-92b6-b13a997075fb" (UID: "5490b488-b520-4906-92b6-b13a997075fb"). InnerVolumeSpecName "kube-api-access-p9zbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.637975 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-util" (OuterVolumeSpecName: "util") pod "5490b488-b520-4906-92b6-b13a997075fb" (UID: "5490b488-b520-4906-92b6-b13a997075fb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.729019 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-bundle\") pod \"5490b488-b520-4906-92b6-b13a997075fb\" (UID: \"5490b488-b520-4906-92b6-b13a997075fb\") " Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.729415 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-util\") on node \"crc\" DevicePath \"\"" Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.729436 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9zbr\" (UniqueName: \"kubernetes.io/projected/5490b488-b520-4906-92b6-b13a997075fb-kube-api-access-p9zbr\") on node \"crc\" DevicePath \"\"" Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.729719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-bundle" (OuterVolumeSpecName: "bundle") pod "5490b488-b520-4906-92b6-b13a997075fb" (UID: "5490b488-b520-4906-92b6-b13a997075fb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:55:28 crc kubenswrapper[4786]: I1002 06:55:28.830315 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5490b488-b520-4906-92b6-b13a997075fb-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:55:29 crc kubenswrapper[4786]: I1002 06:55:29.449155 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" event={"ID":"5490b488-b520-4906-92b6-b13a997075fb","Type":"ContainerDied","Data":"be73d6802be0dad911d5a6a4109adfb8c90923ff96391c601762f5d7b494f9a5"} Oct 02 06:55:29 crc kubenswrapper[4786]: I1002 06:55:29.449426 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be73d6802be0dad911d5a6a4109adfb8c90923ff96391c601762f5d7b494f9a5" Oct 02 06:55:29 crc kubenswrapper[4786]: I1002 06:55:29.449217 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.297408 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pn52f"] Oct 02 06:55:35 crc kubenswrapper[4786]: E1002 06:55:35.297607 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5490b488-b520-4906-92b6-b13a997075fb" containerName="util" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.297621 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5490b488-b520-4906-92b6-b13a997075fb" containerName="util" Oct 02 06:55:35 crc kubenswrapper[4786]: E1002 06:55:35.297634 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5490b488-b520-4906-92b6-b13a997075fb" containerName="pull" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.297640 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5490b488-b520-4906-92b6-b13a997075fb" containerName="pull" Oct 02 06:55:35 crc kubenswrapper[4786]: E1002 06:55:35.297651 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5490b488-b520-4906-92b6-b13a997075fb" containerName="extract" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.297657 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5490b488-b520-4906-92b6-b13a997075fb" containerName="extract" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.297776 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5490b488-b520-4906-92b6-b13a997075fb" containerName="extract" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.298148 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pn52f" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.299597 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk4gl\" (UniqueName: \"kubernetes.io/projected/75a4981d-3614-4013-80e8-dcc8cd60da94-kube-api-access-hk4gl\") pod \"nmstate-operator-858ddd8f98-pn52f\" (UID: \"75a4981d-3614-4013-80e8-dcc8cd60da94\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pn52f" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.299774 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-h27tk" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.299819 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.300785 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.304260 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pn52f"] Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.400570 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk4gl\" (UniqueName: \"kubernetes.io/projected/75a4981d-3614-4013-80e8-dcc8cd60da94-kube-api-access-hk4gl\") pod \"nmstate-operator-858ddd8f98-pn52f\" (UID: \"75a4981d-3614-4013-80e8-dcc8cd60da94\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pn52f" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.415968 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk4gl\" (UniqueName: \"kubernetes.io/projected/75a4981d-3614-4013-80e8-dcc8cd60da94-kube-api-access-hk4gl\") pod \"nmstate-operator-858ddd8f98-pn52f\" (UID: 
\"75a4981d-3614-4013-80e8-dcc8cd60da94\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pn52f" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.611095 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pn52f" Oct 02 06:55:35 crc kubenswrapper[4786]: I1002 06:55:35.963076 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pn52f"] Oct 02 06:55:36 crc kubenswrapper[4786]: I1002 06:55:36.486352 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pn52f" event={"ID":"75a4981d-3614-4013-80e8-dcc8cd60da94","Type":"ContainerStarted","Data":"bd822ee90f21556c4c0544f7d8197e68c1fffe3b31a2bff555983740119a0d3a"} Oct 02 06:55:38 crc kubenswrapper[4786]: I1002 06:55:38.496956 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pn52f" event={"ID":"75a4981d-3614-4013-80e8-dcc8cd60da94","Type":"ContainerStarted","Data":"c8c77db9bcb2223847ae301c5f46d3b0621d5adb632e24ae8a6acc4e6af16078"} Oct 02 06:55:38 crc kubenswrapper[4786]: I1002 06:55:38.519149 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pn52f" podStartSLOduration=1.7673264579999999 podStartE2EDuration="3.519121884s" podCreationTimestamp="2025-10-02 06:55:35 +0000 UTC" firstStartedPulling="2025-10-02 06:55:35.969886855 +0000 UTC m=+546.091069986" lastFinishedPulling="2025-10-02 06:55:37.72168228 +0000 UTC m=+547.842865412" observedRunningTime="2025-10-02 06:55:38.507854852 +0000 UTC m=+548.629037983" watchObservedRunningTime="2025-10-02 06:55:38.519121884 +0000 UTC m=+548.640305015" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.306017 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 
06:55:39.307947 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.309639 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-jmvf8" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.313827 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.327264 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.327992 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.330518 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.336711 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-kfbdc"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.337425 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.348523 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.410522 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.411302 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.412536 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.413323 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.413516 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-pb54v" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.417451 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.444819 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2czth\" (UniqueName: \"kubernetes.io/projected/9b208785-3928-4bc7-a6fd-1bcee5029917-kube-api-access-2czth\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.444958 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wzcr\" (UniqueName: \"kubernetes.io/projected/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-kube-api-access-2wzcr\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.445058 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9376f046-2ecd-4c20-bf82-4f18490d91d9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6szkj\" (UID: 
\"9376f046-2ecd-4c20-bf82-4f18490d91d9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.445140 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-dbus-socket\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.445247 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99n2\" (UniqueName: \"kubernetes.io/projected/84995ef7-937a-442e-a016-f22a24d82882-kube-api-access-l99n2\") pod \"nmstate-metrics-fdff9cb8d-kgjxr\" (UID: \"84995ef7-937a-442e-a016-f22a24d82882\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.445336 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-ovs-socket\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.445453 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.445493 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-nmstate-lock\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.445515 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.445544 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5twn\" (UniqueName: \"kubernetes.io/projected/9376f046-2ecd-4c20-bf82-4f18490d91d9-kube-api-access-q5twn\") pod \"nmstate-webhook-6cdbc54649-6szkj\" (UID: \"9376f046-2ecd-4c20-bf82-4f18490d91d9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2czth\" (UniqueName: \"kubernetes.io/projected/9b208785-3928-4bc7-a6fd-1bcee5029917-kube-api-access-2czth\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546670 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wzcr\" (UniqueName: \"kubernetes.io/projected/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-kube-api-access-2wzcr\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546720 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9376f046-2ecd-4c20-bf82-4f18490d91d9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6szkj\" (UID: \"9376f046-2ecd-4c20-bf82-4f18490d91d9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546745 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-dbus-socket\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546790 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l99n2\" (UniqueName: \"kubernetes.io/projected/84995ef7-937a-442e-a016-f22a24d82882-kube-api-access-l99n2\") pod \"nmstate-metrics-fdff9cb8d-kgjxr\" (UID: \"84995ef7-937a-442e-a016-f22a24d82882\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546818 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-ovs-socket\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546856 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546876 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-nmstate-lock\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546892 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.546913 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5twn\" (UniqueName: \"kubernetes.io/projected/9376f046-2ecd-4c20-bf82-4f18490d91d9-kube-api-access-q5twn\") pod \"nmstate-webhook-6cdbc54649-6szkj\" (UID: \"9376f046-2ecd-4c20-bf82-4f18490d91d9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.547005 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-ovs-socket\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.547801 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-nmstate-lock\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.548008 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9b208785-3928-4bc7-a6fd-1bcee5029917-dbus-socket\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: E1002 06:55:39.548116 4786 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.548252 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: E1002 06:55:39.548262 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-plugin-serving-cert podName:fe88424e-e46c-46b3-a2e0-7bba5ef147b3 nodeName:}" failed. No retries permitted until 2025-10-02 06:55:40.04821968 +0000 UTC m=+550.169402811 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-z8sr6" (UID: "fe88424e-e46c-46b3-a2e0-7bba5ef147b3") : secret "plugin-serving-cert" not found Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.552351 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9376f046-2ecd-4c20-bf82-4f18490d91d9-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-6szkj\" (UID: \"9376f046-2ecd-4c20-bf82-4f18490d91d9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.559719 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wzcr\" (UniqueName: \"kubernetes.io/projected/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-kube-api-access-2wzcr\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.564543 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2czth\" (UniqueName: \"kubernetes.io/projected/9b208785-3928-4bc7-a6fd-1bcee5029917-kube-api-access-2czth\") pod \"nmstate-handler-kfbdc\" (UID: \"9b208785-3928-4bc7-a6fd-1bcee5029917\") " pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.565171 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99n2\" (UniqueName: \"kubernetes.io/projected/84995ef7-937a-442e-a016-f22a24d82882-kube-api-access-l99n2\") pod \"nmstate-metrics-fdff9cb8d-kgjxr\" (UID: \"84995ef7-937a-442e-a016-f22a24d82882\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.573060 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5twn\" (UniqueName: \"kubernetes.io/projected/9376f046-2ecd-4c20-bf82-4f18490d91d9-kube-api-access-q5twn\") pod \"nmstate-webhook-6cdbc54649-6szkj\" (UID: \"9376f046-2ecd-4c20-bf82-4f18490d91d9\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.585509 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-779b546c55-jljrh"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.586138 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.596180 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-779b546c55-jljrh"] Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.620573 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.642554 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.648215 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-console-config\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.648246 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-oauth-serving-cert\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.648267 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-service-ca\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.648283 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-trusted-ca-bundle\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.648306 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1ed1a778-3fb5-4152-a3dd-177555662ace-console-oauth-config\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.648321 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed1a778-3fb5-4152-a3dd-177555662ace-console-serving-cert\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.648364 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmnx\" (UniqueName: \"kubernetes.io/projected/1ed1a778-3fb5-4152-a3dd-177555662ace-kube-api-access-4vmnx\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.648526 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:39 crc kubenswrapper[4786]: W1002 06:55:39.673869 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b208785_3928_4bc7_a6fd_1bcee5029917.slice/crio-bec973f7e1afa32d4ba5ac192799cc07f259cfe8adfbbbe7a978a31a515a61a8 WatchSource:0}: Error finding container bec973f7e1afa32d4ba5ac192799cc07f259cfe8adfbbbe7a978a31a515a61a8: Status 404 returned error can't find the container with id bec973f7e1afa32d4ba5ac192799cc07f259cfe8adfbbbe7a978a31a515a61a8 Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.749611 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-trusted-ca-bundle\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.749932 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ed1a778-3fb5-4152-a3dd-177555662ace-console-oauth-config\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.749956 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed1a778-3fb5-4152-a3dd-177555662ace-console-serving-cert\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.750056 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmnx\" (UniqueName: 
\"kubernetes.io/projected/1ed1a778-3fb5-4152-a3dd-177555662ace-kube-api-access-4vmnx\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.750132 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-console-config\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.750154 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-oauth-serving-cert\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.750169 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-service-ca\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.751070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-service-ca\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.751118 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-trusted-ca-bundle\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.751133 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-console-config\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.751195 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ed1a778-3fb5-4152-a3dd-177555662ace-oauth-serving-cert\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.752805 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ed1a778-3fb5-4152-a3dd-177555662ace-console-serving-cert\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.754969 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ed1a778-3fb5-4152-a3dd-177555662ace-console-oauth-config\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.765133 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmnx\" (UniqueName: 
\"kubernetes.io/projected/1ed1a778-3fb5-4152-a3dd-177555662ace-kube-api-access-4vmnx\") pod \"console-779b546c55-jljrh\" (UID: \"1ed1a778-3fb5-4152-a3dd-177555662ace\") " pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.912071 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:39 crc kubenswrapper[4786]: I1002 06:55:39.995321 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj"] Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.029023 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr"] Oct 02 06:55:40 crc kubenswrapper[4786]: W1002 06:55:40.033169 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84995ef7_937a_442e_a016_f22a24d82882.slice/crio-05f47dcf6c10c46aec462c18a37f07675f4f7022d22422b3c57282f72f2792bf WatchSource:0}: Error finding container 05f47dcf6c10c46aec462c18a37f07675f4f7022d22422b3c57282f72f2792bf: Status 404 returned error can't find the container with id 05f47dcf6c10c46aec462c18a37f07675f4f7022d22422b3c57282f72f2792bf Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.048686 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-779b546c55-jljrh"] Oct 02 06:55:40 crc kubenswrapper[4786]: W1002 06:55:40.051092 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ed1a778_3fb5_4152_a3dd_177555662ace.slice/crio-1d162852c7ec7f7daf4b91c9f720abb03e7de3401045474960f2801911d4ad82 WatchSource:0}: Error finding container 1d162852c7ec7f7daf4b91c9f720abb03e7de3401045474960f2801911d4ad82: Status 404 returned error can't find the container with id 
1d162852c7ec7f7daf4b91c9f720abb03e7de3401045474960f2801911d4ad82 Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.052527 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.055532 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe88424e-e46c-46b3-a2e0-7bba5ef147b3-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-z8sr6\" (UID: \"fe88424e-e46c-46b3-a2e0-7bba5ef147b3\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.323007 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.451019 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6"] Oct 02 06:55:40 crc kubenswrapper[4786]: W1002 06:55:40.455747 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe88424e_e46c_46b3_a2e0_7bba5ef147b3.slice/crio-e48cbf20cc6e341a7f814a28c039a5c2c10c24371c4d1978e539c63fb38f0c2b WatchSource:0}: Error finding container e48cbf20cc6e341a7f814a28c039a5c2c10c24371c4d1978e539c63fb38f0c2b: Status 404 returned error can't find the container with id e48cbf20cc6e341a7f814a28c039a5c2c10c24371c4d1978e539c63fb38f0c2b Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.505823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779b546c55-jljrh" event={"ID":"1ed1a778-3fb5-4152-a3dd-177555662ace","Type":"ContainerStarted","Data":"0993eaa96742a69da261b5932bb47d2a4127a8c421c332136e04b42aae122508"} Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.505873 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779b546c55-jljrh" event={"ID":"1ed1a778-3fb5-4152-a3dd-177555662ace","Type":"ContainerStarted","Data":"1d162852c7ec7f7daf4b91c9f720abb03e7de3401045474960f2801911d4ad82"} Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.507229 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" event={"ID":"fe88424e-e46c-46b3-a2e0-7bba5ef147b3","Type":"ContainerStarted","Data":"e48cbf20cc6e341a7f814a28c039a5c2c10c24371c4d1978e539c63fb38f0c2b"} Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.508883 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" 
event={"ID":"84995ef7-937a-442e-a016-f22a24d82882","Type":"ContainerStarted","Data":"05f47dcf6c10c46aec462c18a37f07675f4f7022d22422b3c57282f72f2792bf"} Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.509952 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kfbdc" event={"ID":"9b208785-3928-4bc7-a6fd-1bcee5029917","Type":"ContainerStarted","Data":"bec973f7e1afa32d4ba5ac192799cc07f259cfe8adfbbbe7a978a31a515a61a8"} Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.510719 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" event={"ID":"9376f046-2ecd-4c20-bf82-4f18490d91d9","Type":"ContainerStarted","Data":"c6d22f5c39a6040ba6c742e188c2c31141f4ea4a7088d056b130f04165170e58"} Oct 02 06:55:40 crc kubenswrapper[4786]: I1002 06:55:40.523112 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-779b546c55-jljrh" podStartSLOduration=1.5230892329999999 podStartE2EDuration="1.523089233s" podCreationTimestamp="2025-10-02 06:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:55:40.517988322 +0000 UTC m=+550.639171463" watchObservedRunningTime="2025-10-02 06:55:40.523089233 +0000 UTC m=+550.644272364" Oct 02 06:55:42 crc kubenswrapper[4786]: I1002 06:55:42.524873 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" event={"ID":"84995ef7-937a-442e-a016-f22a24d82882","Type":"ContainerStarted","Data":"1609c51159064cd5584eca83de851afa27adb3859615e3216a67ea558bf9ef08"} Oct 02 06:55:42 crc kubenswrapper[4786]: I1002 06:55:42.527510 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" 
event={"ID":"9376f046-2ecd-4c20-bf82-4f18490d91d9","Type":"ContainerStarted","Data":"ed097e3d0a040a75b13e19ea7b477650d8e73aad0a5b249d9b05ad9a460260cd"} Oct 02 06:55:42 crc kubenswrapper[4786]: I1002 06:55:42.527612 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:55:42 crc kubenswrapper[4786]: I1002 06:55:42.530566 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" event={"ID":"fe88424e-e46c-46b3-a2e0-7bba5ef147b3","Type":"ContainerStarted","Data":"3c4c6006eba421f6583daff709e37aa061aaa1a922b427015db518bd755bd807"} Oct 02 06:55:42 crc kubenswrapper[4786]: I1002 06:55:42.541967 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" podStartSLOduration=1.178712807 podStartE2EDuration="3.541951503s" podCreationTimestamp="2025-10-02 06:55:39 +0000 UTC" firstStartedPulling="2025-10-02 06:55:40.002858712 +0000 UTC m=+550.124041844" lastFinishedPulling="2025-10-02 06:55:42.366097408 +0000 UTC m=+552.487280540" observedRunningTime="2025-10-02 06:55:42.540109335 +0000 UTC m=+552.661292476" watchObservedRunningTime="2025-10-02 06:55:42.541951503 +0000 UTC m=+552.663134635" Oct 02 06:55:42 crc kubenswrapper[4786]: I1002 06:55:42.552974 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-z8sr6" podStartSLOduration=1.634589525 podStartE2EDuration="3.552960668s" podCreationTimestamp="2025-10-02 06:55:39 +0000 UTC" firstStartedPulling="2025-10-02 06:55:40.457859068 +0000 UTC m=+550.579042198" lastFinishedPulling="2025-10-02 06:55:42.37623021 +0000 UTC m=+552.497413341" observedRunningTime="2025-10-02 06:55:42.551811649 +0000 UTC m=+552.672994800" watchObservedRunningTime="2025-10-02 06:55:42.552960668 +0000 UTC m=+552.674143799" Oct 02 06:55:43 crc kubenswrapper[4786]: I1002 
06:55:43.538446 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kfbdc" event={"ID":"9b208785-3928-4bc7-a6fd-1bcee5029917","Type":"ContainerStarted","Data":"da5bc2323e253990382f0995468cb032cb3e2c5cf2cad9d962487943375370a7"} Oct 02 06:55:43 crc kubenswrapper[4786]: I1002 06:55:43.550759 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-kfbdc" podStartSLOduration=1.883267863 podStartE2EDuration="4.550740355s" podCreationTimestamp="2025-10-02 06:55:39 +0000 UTC" firstStartedPulling="2025-10-02 06:55:39.676017524 +0000 UTC m=+549.797200655" lastFinishedPulling="2025-10-02 06:55:42.343490015 +0000 UTC m=+552.464673147" observedRunningTime="2025-10-02 06:55:43.548854745 +0000 UTC m=+553.670037886" watchObservedRunningTime="2025-10-02 06:55:43.550740355 +0000 UTC m=+553.671923487" Oct 02 06:55:44 crc kubenswrapper[4786]: I1002 06:55:44.545341 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" event={"ID":"84995ef7-937a-442e-a016-f22a24d82882","Type":"ContainerStarted","Data":"171c7c7763fb9feaecb32568002024801b8b425a08a9186d01c8b16a33a4c512"} Oct 02 06:55:44 crc kubenswrapper[4786]: I1002 06:55:44.545615 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:44 crc kubenswrapper[4786]: I1002 06:55:44.557977 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kgjxr" podStartSLOduration=1.238886535 podStartE2EDuration="5.557957741s" podCreationTimestamp="2025-10-02 06:55:39 +0000 UTC" firstStartedPulling="2025-10-02 06:55:40.037970253 +0000 UTC m=+550.159153384" lastFinishedPulling="2025-10-02 06:55:44.357041459 +0000 UTC m=+554.478224590" observedRunningTime="2025-10-02 06:55:44.555498919 +0000 UTC m=+554.676682060" watchObservedRunningTime="2025-10-02 
06:55:44.557957741 +0000 UTC m=+554.679140873" Oct 02 06:55:49 crc kubenswrapper[4786]: I1002 06:55:49.668083 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-kfbdc" Oct 02 06:55:49 crc kubenswrapper[4786]: I1002 06:55:49.912364 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:49 crc kubenswrapper[4786]: I1002 06:55:49.912423 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:49 crc kubenswrapper[4786]: I1002 06:55:49.916369 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:50 crc kubenswrapper[4786]: I1002 06:55:50.575807 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-779b546c55-jljrh" Oct 02 06:55:50 crc kubenswrapper[4786]: I1002 06:55:50.611934 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ph6hx"] Oct 02 06:55:57 crc kubenswrapper[4786]: I1002 06:55:57.498090 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 06:55:57 crc kubenswrapper[4786]: I1002 06:55:57.498540 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 06:55:57 crc kubenswrapper[4786]: I1002 06:55:57.498593 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:55:57 crc kubenswrapper[4786]: I1002 06:55:57.499090 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b5bd4f7ef853564be38c5c22d2f88f290b16d0915727305dc89bbed1ec9a81c"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 06:55:57 crc kubenswrapper[4786]: I1002 06:55:57.499152 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://6b5bd4f7ef853564be38c5c22d2f88f290b16d0915727305dc89bbed1ec9a81c" gracePeriod=600 Oct 02 06:55:58 crc kubenswrapper[4786]: I1002 06:55:58.611171 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="6b5bd4f7ef853564be38c5c22d2f88f290b16d0915727305dc89bbed1ec9a81c" exitCode=0 Oct 02 06:55:58 crc kubenswrapper[4786]: I1002 06:55:58.611235 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"6b5bd4f7ef853564be38c5c22d2f88f290b16d0915727305dc89bbed1ec9a81c"} Oct 02 06:55:58 crc kubenswrapper[4786]: I1002 06:55:58.611591 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"613104429ff7e56e4d88582bf64ddcf8603f93d2b0b8b15a934f56112fabd10d"} Oct 02 06:55:58 crc kubenswrapper[4786]: I1002 06:55:58.611619 4786 scope.go:117] "RemoveContainer" 
containerID="2c64abc9152933569ee60a2038ba082fca146fe30b68dc264add9f4e59c75ef2" Oct 02 06:55:59 crc kubenswrapper[4786]: I1002 06:55:59.652750 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-6szkj" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.728619 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf"] Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.730133 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.731567 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.759080 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.759170 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.759257 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kd4cr\" (UniqueName: \"kubernetes.io/projected/1bee82fb-73b9-40ee-800d-9b85d84324d6-kube-api-access-kd4cr\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.769971 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf"] Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.860483 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.860543 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.860607 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd4cr\" (UniqueName: \"kubernetes.io/projected/1bee82fb-73b9-40ee-800d-9b85d84324d6-kube-api-access-kd4cr\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 
06:56:08.861679 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.861758 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:08 crc kubenswrapper[4786]: I1002 06:56:08.876263 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd4cr\" (UniqueName: \"kubernetes.io/projected/1bee82fb-73b9-40ee-800d-9b85d84324d6-kube-api-access-kd4cr\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:09 crc kubenswrapper[4786]: I1002 06:56:09.044803 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:09 crc kubenswrapper[4786]: I1002 06:56:09.389350 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf"] Oct 02 06:56:09 crc kubenswrapper[4786]: I1002 06:56:09.668060 4786 generic.go:334] "Generic (PLEG): container finished" podID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerID="29ed6b782d94db5714a860cb364e236128006f5affb7334848080af28b29a6fe" exitCode=0 Oct 02 06:56:09 crc kubenswrapper[4786]: I1002 06:56:09.668181 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" event={"ID":"1bee82fb-73b9-40ee-800d-9b85d84324d6","Type":"ContainerDied","Data":"29ed6b782d94db5714a860cb364e236128006f5affb7334848080af28b29a6fe"} Oct 02 06:56:09 crc kubenswrapper[4786]: I1002 06:56:09.668325 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" event={"ID":"1bee82fb-73b9-40ee-800d-9b85d84324d6","Type":"ContainerStarted","Data":"5403a0df07ae670476980344cd7c161e4c411dc22dd70d7d22faf210d529ca73"} Oct 02 06:56:11 crc kubenswrapper[4786]: I1002 06:56:11.679216 4786 generic.go:334] "Generic (PLEG): container finished" podID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerID="5a96d7918b24c18e6b8c672104595ba81930ee04dacc2be3ba1345c3dc858fb1" exitCode=0 Oct 02 06:56:11 crc kubenswrapper[4786]: I1002 06:56:11.679309 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" event={"ID":"1bee82fb-73b9-40ee-800d-9b85d84324d6","Type":"ContainerDied","Data":"5a96d7918b24c18e6b8c672104595ba81930ee04dacc2be3ba1345c3dc858fb1"} Oct 02 06:56:12 crc kubenswrapper[4786]: I1002 06:56:12.695241 4786 
generic.go:334] "Generic (PLEG): container finished" podID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerID="c78ee3df0e195fb97a6b5a7f83a8771a99f4676d89e903a9b981535aa8e86414" exitCode=0 Oct 02 06:56:12 crc kubenswrapper[4786]: I1002 06:56:12.695355 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" event={"ID":"1bee82fb-73b9-40ee-800d-9b85d84324d6","Type":"ContainerDied","Data":"c78ee3df0e195fb97a6b5a7f83a8771a99f4676d89e903a9b981535aa8e86414"} Oct 02 06:56:13 crc kubenswrapper[4786]: I1002 06:56:13.892508 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.010015 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd4cr\" (UniqueName: \"kubernetes.io/projected/1bee82fb-73b9-40ee-800d-9b85d84324d6-kube-api-access-kd4cr\") pod \"1bee82fb-73b9-40ee-800d-9b85d84324d6\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.010149 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-bundle\") pod \"1bee82fb-73b9-40ee-800d-9b85d84324d6\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.010190 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-util\") pod \"1bee82fb-73b9-40ee-800d-9b85d84324d6\" (UID: \"1bee82fb-73b9-40ee-800d-9b85d84324d6\") " Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.011177 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-bundle" (OuterVolumeSpecName: "bundle") pod "1bee82fb-73b9-40ee-800d-9b85d84324d6" (UID: "1bee82fb-73b9-40ee-800d-9b85d84324d6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.011308 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.014443 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bee82fb-73b9-40ee-800d-9b85d84324d6-kube-api-access-kd4cr" (OuterVolumeSpecName: "kube-api-access-kd4cr") pod "1bee82fb-73b9-40ee-800d-9b85d84324d6" (UID: "1bee82fb-73b9-40ee-800d-9b85d84324d6"). InnerVolumeSpecName "kube-api-access-kd4cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.020361 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-util" (OuterVolumeSpecName: "util") pod "1bee82fb-73b9-40ee-800d-9b85d84324d6" (UID: "1bee82fb-73b9-40ee-800d-9b85d84324d6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.112246 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd4cr\" (UniqueName: \"kubernetes.io/projected/1bee82fb-73b9-40ee-800d-9b85d84324d6-kube-api-access-kd4cr\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.112273 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bee82fb-73b9-40ee-800d-9b85d84324d6-util\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.705743 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" event={"ID":"1bee82fb-73b9-40ee-800d-9b85d84324d6","Type":"ContainerDied","Data":"5403a0df07ae670476980344cd7c161e4c411dc22dd70d7d22faf210d529ca73"} Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.705776 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5403a0df07ae670476980344cd7c161e4c411dc22dd70d7d22faf210d529ca73" Oct 02 06:56:14 crc kubenswrapper[4786]: I1002 06:56:14.705807 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf" Oct 02 06:56:15 crc kubenswrapper[4786]: I1002 06:56:15.637990 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ph6hx" podUID="ca2df496-2124-4881-ae15-4fa5a1a4f0ea" containerName="console" containerID="cri-o://16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5" gracePeriod=15 Oct 02 06:56:15 crc kubenswrapper[4786]: I1002 06:56:15.951539 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ph6hx_ca2df496-2124-4881-ae15-4fa5a1a4f0ea/console/0.log" Oct 02 06:56:15 crc kubenswrapper[4786]: I1002 06:56:15.951821 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.031098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-serving-cert\") pod \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.031145 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nfwp\" (UniqueName: \"kubernetes.io/projected/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-kube-api-access-7nfwp\") pod \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.031193 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-config\") pod \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " Oct 02 06:56:16 crc 
kubenswrapper[4786]: I1002 06:56:16.031229 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-trusted-ca-bundle\") pod \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.031274 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-service-ca\") pod \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.032313 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-config" (OuterVolumeSpecName: "console-config") pod "ca2df496-2124-4881-ae15-4fa5a1a4f0ea" (UID: "ca2df496-2124-4881-ae15-4fa5a1a4f0ea"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.032335 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ca2df496-2124-4881-ae15-4fa5a1a4f0ea" (UID: "ca2df496-2124-4881-ae15-4fa5a1a4f0ea"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.032372 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-service-ca" (OuterVolumeSpecName: "service-ca") pod "ca2df496-2124-4881-ae15-4fa5a1a4f0ea" (UID: "ca2df496-2124-4881-ae15-4fa5a1a4f0ea"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.036951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ca2df496-2124-4881-ae15-4fa5a1a4f0ea" (UID: "ca2df496-2124-4881-ae15-4fa5a1a4f0ea"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.037003 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-kube-api-access-7nfwp" (OuterVolumeSpecName: "kube-api-access-7nfwp") pod "ca2df496-2124-4881-ae15-4fa5a1a4f0ea" (UID: "ca2df496-2124-4881-ae15-4fa5a1a4f0ea"). InnerVolumeSpecName "kube-api-access-7nfwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.132430 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-oauth-serving-cert\") pod \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.132491 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-oauth-config\") pod \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\" (UID: \"ca2df496-2124-4881-ae15-4fa5a1a4f0ea\") " Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.132858 4786 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:16 crc 
kubenswrapper[4786]: I1002 06:56:16.132881 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nfwp\" (UniqueName: \"kubernetes.io/projected/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-kube-api-access-7nfwp\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.132891 4786 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.132899 4786 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.132907 4786 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.132951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ca2df496-2124-4881-ae15-4fa5a1a4f0ea" (UID: "ca2df496-2124-4881-ae15-4fa5a1a4f0ea"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.135386 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ca2df496-2124-4881-ae15-4fa5a1a4f0ea" (UID: "ca2df496-2124-4881-ae15-4fa5a1a4f0ea"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.234183 4786 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.234661 4786 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ca2df496-2124-4881-ae15-4fa5a1a4f0ea-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.716338 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ph6hx_ca2df496-2124-4881-ae15-4fa5a1a4f0ea/console/0.log" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.716395 4786 generic.go:334] "Generic (PLEG): container finished" podID="ca2df496-2124-4881-ae15-4fa5a1a4f0ea" containerID="16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5" exitCode=2 Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.716434 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ph6hx" event={"ID":"ca2df496-2124-4881-ae15-4fa5a1a4f0ea","Type":"ContainerDied","Data":"16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5"} Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.716463 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ph6hx" event={"ID":"ca2df496-2124-4881-ae15-4fa5a1a4f0ea","Type":"ContainerDied","Data":"a46a57ecbcd87e3a54ddd2b1c4300ed448c1c11596fff619ed978cae544b5fb8"} Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.716481 4786 scope.go:117] "RemoveContainer" containerID="16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.716484 4786 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ph6hx" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.731168 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ph6hx"] Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.731821 4786 scope.go:117] "RemoveContainer" containerID="16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5" Oct 02 06:56:16 crc kubenswrapper[4786]: E1002 06:56:16.732283 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5\": container with ID starting with 16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5 not found: ID does not exist" containerID="16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.732327 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5"} err="failed to get container status \"16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5\": rpc error: code = NotFound desc = could not find container \"16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5\": container with ID starting with 16985e466dcbd2d14ae593961e39a1e2b28faf3233583a99279b870e72c058b5 not found: ID does not exist" Oct 02 06:56:16 crc kubenswrapper[4786]: I1002 06:56:16.733160 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ph6hx"] Oct 02 06:56:18 crc kubenswrapper[4786]: I1002 06:56:18.185015 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2df496-2124-4881-ae15-4fa5a1a4f0ea" path="/var/lib/kubelet/pods/ca2df496-2124-4881-ae15-4fa5a1a4f0ea/volumes" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.160425 4786 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x"] Oct 02 06:56:23 crc kubenswrapper[4786]: E1002 06:56:23.160800 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerName="util" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.160812 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerName="util" Oct 02 06:56:23 crc kubenswrapper[4786]: E1002 06:56:23.160822 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerName="extract" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.160828 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerName="extract" Oct 02 06:56:23 crc kubenswrapper[4786]: E1002 06:56:23.160845 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerName="pull" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.160850 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerName="pull" Oct 02 06:56:23 crc kubenswrapper[4786]: E1002 06:56:23.160858 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2df496-2124-4881-ae15-4fa5a1a4f0ea" containerName="console" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.160862 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2df496-2124-4881-ae15-4fa5a1a4f0ea" containerName="console" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.160974 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bee82fb-73b9-40ee-800d-9b85d84324d6" containerName="extract" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.160986 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2df496-2124-4881-ae15-4fa5a1a4f0ea" 
containerName="console" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.161308 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.162465 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.163587 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.163820 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-44bb4" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.164121 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.164600 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.202488 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x"] Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.215023 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-apiservice-cert\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.215101 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-webhook-cert\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.215122 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pd8\" (UniqueName: \"kubernetes.io/projected/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-kube-api-access-g6pd8\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.316454 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-webhook-cert\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.316721 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pd8\" (UniqueName: \"kubernetes.io/projected/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-kube-api-access-g6pd8\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.316787 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-apiservice-cert\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " 
pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.321540 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-apiservice-cert\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.328360 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-webhook-cert\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.332429 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pd8\" (UniqueName: \"kubernetes.io/projected/42b1054f-e5f7-4d21-a4a8-98bcb85946c5-kube-api-access-g6pd8\") pod \"metallb-operator-controller-manager-85846c6c54-9wx2x\" (UID: \"42b1054f-e5f7-4d21-a4a8-98bcb85946c5\") " pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.473235 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.503522 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf"] Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.504338 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.506460 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.520279 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b29b896b-61fb-40e7-80ce-d87fc031e3ae-webhook-cert\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.520375 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b29b896b-61fb-40e7-80ce-d87fc031e3ae-apiservice-cert\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.520406 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5cdh\" (UniqueName: \"kubernetes.io/projected/b29b896b-61fb-40e7-80ce-d87fc031e3ae-kube-api-access-m5cdh\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.520617 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.523804 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf"] Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.523968 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bpxnk" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.621259 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b29b896b-61fb-40e7-80ce-d87fc031e3ae-apiservice-cert\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.621493 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5cdh\" (UniqueName: \"kubernetes.io/projected/b29b896b-61fb-40e7-80ce-d87fc031e3ae-kube-api-access-m5cdh\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.621563 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b29b896b-61fb-40e7-80ce-d87fc031e3ae-webhook-cert\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.627443 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b29b896b-61fb-40e7-80ce-d87fc031e3ae-webhook-cert\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc 
kubenswrapper[4786]: I1002 06:56:23.627453 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b29b896b-61fb-40e7-80ce-d87fc031e3ae-apiservice-cert\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.638783 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5cdh\" (UniqueName: \"kubernetes.io/projected/b29b896b-61fb-40e7-80ce-d87fc031e3ae-kube-api-access-m5cdh\") pod \"metallb-operator-webhook-server-6765d85898-4g7pf\" (UID: \"b29b896b-61fb-40e7-80ce-d87fc031e3ae\") " pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.823956 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.920320 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x"] Oct 02 06:56:23 crc kubenswrapper[4786]: W1002 06:56:23.926272 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b1054f_e5f7_4d21_a4a8_98bcb85946c5.slice/crio-d83e8a2994b6a86e2e98a6f1e2b2d22e9ee006c5efc1df9978cbf890a14aea61 WatchSource:0}: Error finding container d83e8a2994b6a86e2e98a6f1e2b2d22e9ee006c5efc1df9978cbf890a14aea61: Status 404 returned error can't find the container with id d83e8a2994b6a86e2e98a6f1e2b2d22e9ee006c5efc1df9978cbf890a14aea61 Oct 02 06:56:23 crc kubenswrapper[4786]: I1002 06:56:23.988832 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf"] Oct 02 06:56:23 crc 
kubenswrapper[4786]: W1002 06:56:23.995063 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb29b896b_61fb_40e7_80ce_d87fc031e3ae.slice/crio-a17e5a4c93ecb152326acfd4588c70a5e68ba2474cac2e77cdccde170c92a75c WatchSource:0}: Error finding container a17e5a4c93ecb152326acfd4588c70a5e68ba2474cac2e77cdccde170c92a75c: Status 404 returned error can't find the container with id a17e5a4c93ecb152326acfd4588c70a5e68ba2474cac2e77cdccde170c92a75c Oct 02 06:56:24 crc kubenswrapper[4786]: I1002 06:56:24.758354 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" event={"ID":"42b1054f-e5f7-4d21-a4a8-98bcb85946c5","Type":"ContainerStarted","Data":"d83e8a2994b6a86e2e98a6f1e2b2d22e9ee006c5efc1df9978cbf890a14aea61"} Oct 02 06:56:24 crc kubenswrapper[4786]: I1002 06:56:24.759471 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" event={"ID":"b29b896b-61fb-40e7-80ce-d87fc031e3ae","Type":"ContainerStarted","Data":"a17e5a4c93ecb152326acfd4588c70a5e68ba2474cac2e77cdccde170c92a75c"} Oct 02 06:56:28 crc kubenswrapper[4786]: I1002 06:56:28.790892 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" event={"ID":"b29b896b-61fb-40e7-80ce-d87fc031e3ae","Type":"ContainerStarted","Data":"6da6de3ea35aabca357fc9d890e9d9cf107bfb908a6aad83c25d15e3078bbf76"} Oct 02 06:56:28 crc kubenswrapper[4786]: I1002 06:56:28.791578 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:56:28 crc kubenswrapper[4786]: I1002 06:56:28.792408 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" 
event={"ID":"42b1054f-e5f7-4d21-a4a8-98bcb85946c5","Type":"ContainerStarted","Data":"b7d86c058c5297dfe6084b1691e9ea7c752ba8a6b7b300bc0d1402aed3f8be4d"} Oct 02 06:56:28 crc kubenswrapper[4786]: I1002 06:56:28.792566 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:56:28 crc kubenswrapper[4786]: I1002 06:56:28.808748 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" podStartSLOduration=2.075804116 podStartE2EDuration="5.80873291s" podCreationTimestamp="2025-10-02 06:56:23 +0000 UTC" firstStartedPulling="2025-10-02 06:56:23.998616266 +0000 UTC m=+594.119799397" lastFinishedPulling="2025-10-02 06:56:27.73154506 +0000 UTC m=+597.852728191" observedRunningTime="2025-10-02 06:56:28.805041441 +0000 UTC m=+598.926224592" watchObservedRunningTime="2025-10-02 06:56:28.80873291 +0000 UTC m=+598.929916042" Oct 02 06:56:28 crc kubenswrapper[4786]: I1002 06:56:28.827892 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" podStartSLOduration=2.038624742 podStartE2EDuration="5.827880231s" podCreationTimestamp="2025-10-02 06:56:23 +0000 UTC" firstStartedPulling="2025-10-02 06:56:23.928482863 +0000 UTC m=+594.049665994" lastFinishedPulling="2025-10-02 06:56:27.717738353 +0000 UTC m=+597.838921483" observedRunningTime="2025-10-02 06:56:28.824711387 +0000 UTC m=+598.945894529" watchObservedRunningTime="2025-10-02 06:56:28.827880231 +0000 UTC m=+598.949063361" Oct 02 06:56:43 crc kubenswrapper[4786]: I1002 06:56:43.827463 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6765d85898-4g7pf" Oct 02 06:57:03 crc kubenswrapper[4786]: I1002 06:57:03.475351 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-85846c6c54-9wx2x" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.017721 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-lc2gd"] Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.019560 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.021509 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx"] Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.022195 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.022932 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.023146 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tm4zw" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.023326 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.023360 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.036705 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx"] Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.079548 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6drsl"] Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.080466 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.082457 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.082627 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-6f2mx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.082812 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.084239 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.086253 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-4pdp4"] Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.086877 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.088064 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.094902 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-4pdp4"] Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105489 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-sockets\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105557 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-startup\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105576 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gl2g\" (UniqueName: \"kubernetes.io/projected/0678c649-4b0b-4079-865a-7e85f6005a3d-kube-api-access-4gl2g\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105605 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-conf\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105620 
4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-metrics\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105640 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpbvc\" (UniqueName: \"kubernetes.io/projected/bfefe2c3-8e13-45a3-b700-cda75a37345c-kube-api-access-vpbvc\") pod \"frr-k8s-webhook-server-64bf5d555-qspcx\" (UID: \"bfefe2c3-8e13-45a3-b700-cda75a37345c\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105659 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0678c649-4b0b-4079-865a-7e85f6005a3d-metrics-certs\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105677 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfefe2c3-8e13-45a3-b700-cda75a37345c-cert\") pod \"frr-k8s-webhook-server-64bf5d555-qspcx\" (UID: \"bfefe2c3-8e13-45a3-b700-cda75a37345c\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.105739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-reloader\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.206380 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b91fb80-e839-4d96-aa9a-4e08642aafe1-metallb-excludel2\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.206426 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-startup\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.206446 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gl2g\" (UniqueName: \"kubernetes.io/projected/0678c649-4b0b-4079-865a-7e85f6005a3d-kube-api-access-4gl2g\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.206460 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/defc1b9a-20a3-4272-af46-ad01ef957dba-cert\") pod \"controller-68d546b9d8-4pdp4\" (UID: \"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207590 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-conf\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-metrics\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207681 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpbvc\" (UniqueName: \"kubernetes.io/projected/bfefe2c3-8e13-45a3-b700-cda75a37345c-kube-api-access-vpbvc\") pod \"frr-k8s-webhook-server-64bf5d555-qspcx\" (UID: \"bfefe2c3-8e13-45a3-b700-cda75a37345c\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207721 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7hx\" (UniqueName: \"kubernetes.io/projected/3b91fb80-e839-4d96-aa9a-4e08642aafe1-kube-api-access-rp7hx\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207743 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0678c649-4b0b-4079-865a-7e85f6005a3d-metrics-certs\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207770 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bfefe2c3-8e13-45a3-b700-cda75a37345c-cert\") pod \"frr-k8s-webhook-server-64bf5d555-qspcx\" (UID: \"bfefe2c3-8e13-45a3-b700-cda75a37345c\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207850 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207869 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-reloader\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207884 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/defc1b9a-20a3-4272-af46-ad01ef957dba-metrics-certs\") pod \"controller-68d546b9d8-4pdp4\" (UID: \"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207914 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-sockets\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207942 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-metrics-certs\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.207965 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nznq\" (UniqueName: \"kubernetes.io/projected/defc1b9a-20a3-4272-af46-ad01ef957dba-kube-api-access-9nznq\") pod 
\"controller-68d546b9d8-4pdp4\" (UID: \"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.208065 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-conf\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.208096 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-metrics\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.208333 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-sockets\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.208366 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0678c649-4b0b-4079-865a-7e85f6005a3d-reloader\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.208940 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0678c649-4b0b-4079-865a-7e85f6005a3d-frr-startup\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.214455 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/bfefe2c3-8e13-45a3-b700-cda75a37345c-cert\") pod \"frr-k8s-webhook-server-64bf5d555-qspcx\" (UID: \"bfefe2c3-8e13-45a3-b700-cda75a37345c\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.217821 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0678c649-4b0b-4079-865a-7e85f6005a3d-metrics-certs\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.224129 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gl2g\" (UniqueName: \"kubernetes.io/projected/0678c649-4b0b-4079-865a-7e85f6005a3d-kube-api-access-4gl2g\") pod \"frr-k8s-lc2gd\" (UID: \"0678c649-4b0b-4079-865a-7e85f6005a3d\") " pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.224228 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpbvc\" (UniqueName: \"kubernetes.io/projected/bfefe2c3-8e13-45a3-b700-cda75a37345c-kube-api-access-vpbvc\") pod \"frr-k8s-webhook-server-64bf5d555-qspcx\" (UID: \"bfefe2c3-8e13-45a3-b700-cda75a37345c\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.309718 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.309758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/defc1b9a-20a3-4272-af46-ad01ef957dba-metrics-certs\") pod 
\"controller-68d546b9d8-4pdp4\" (UID: \"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.309800 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-metrics-certs\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.309821 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nznq\" (UniqueName: \"kubernetes.io/projected/defc1b9a-20a3-4272-af46-ad01ef957dba-kube-api-access-9nznq\") pod \"controller-68d546b9d8-4pdp4\" (UID: \"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.309845 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b91fb80-e839-4d96-aa9a-4e08642aafe1-metallb-excludel2\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.309875 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/defc1b9a-20a3-4272-af46-ad01ef957dba-cert\") pod \"controller-68d546b9d8-4pdp4\" (UID: \"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: E1002 06:57:04.309878 4786 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.309915 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7hx\" (UniqueName: 
\"kubernetes.io/projected/3b91fb80-e839-4d96-aa9a-4e08642aafe1-kube-api-access-rp7hx\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: E1002 06:57:04.309932 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist podName:3b91fb80-e839-4d96-aa9a-4e08642aafe1 nodeName:}" failed. No retries permitted until 2025-10-02 06:57:04.809918098 +0000 UTC m=+634.931101229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist") pod "speaker-6drsl" (UID: "3b91fb80-e839-4d96-aa9a-4e08642aafe1") : secret "metallb-memberlist" not found Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.310766 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b91fb80-e839-4d96-aa9a-4e08642aafe1-metallb-excludel2\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.311390 4786 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.312556 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-metrics-certs\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.313018 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/defc1b9a-20a3-4272-af46-ad01ef957dba-metrics-certs\") pod \"controller-68d546b9d8-4pdp4\" (UID: 
\"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.322246 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/defc1b9a-20a3-4272-af46-ad01ef957dba-cert\") pod \"controller-68d546b9d8-4pdp4\" (UID: \"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.322350 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nznq\" (UniqueName: \"kubernetes.io/projected/defc1b9a-20a3-4272-af46-ad01ef957dba-kube-api-access-9nznq\") pod \"controller-68d546b9d8-4pdp4\" (UID: \"defc1b9a-20a3-4272-af46-ad01ef957dba\") " pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.322448 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7hx\" (UniqueName: \"kubernetes.io/projected/3b91fb80-e839-4d96-aa9a-4e08642aafe1-kube-api-access-rp7hx\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.334337 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.342349 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.402834 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.677263 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx"] Oct 02 06:57:04 crc kubenswrapper[4786]: W1002 06:57:04.681850 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfefe2c3_8e13_45a3_b700_cda75a37345c.slice/crio-f86affbd3d6e0bc819b7abb85b38b54c43ba4ca760c7d95188c3b03fd8b3c2a3 WatchSource:0}: Error finding container f86affbd3d6e0bc819b7abb85b38b54c43ba4ca760c7d95188c3b03fd8b3c2a3: Status 404 returned error can't find the container with id f86affbd3d6e0bc819b7abb85b38b54c43ba4ca760c7d95188c3b03fd8b3c2a3 Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.733303 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-4pdp4"] Oct 02 06:57:04 crc kubenswrapper[4786]: W1002 06:57:04.735430 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefc1b9a_20a3_4272_af46_ad01ef957dba.slice/crio-f30e4cae65a307bb023ae9e02bc01aa7b6ed6d1b622a1d1f88604cca969cf657 WatchSource:0}: Error finding container f30e4cae65a307bb023ae9e02bc01aa7b6ed6d1b622a1d1f88604cca969cf657: Status 404 returned error can't find the container with id f30e4cae65a307bb023ae9e02bc01aa7b6ed6d1b622a1d1f88604cca969cf657 Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.815599 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:04 crc kubenswrapper[4786]: E1002 06:57:04.815743 4786 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Oct 02 06:57:04 crc kubenswrapper[4786]: E1002 06:57:04.815792 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist podName:3b91fb80-e839-4d96-aa9a-4e08642aafe1 nodeName:}" failed. No retries permitted until 2025-10-02 06:57:05.815776213 +0000 UTC m=+635.936959344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist") pod "speaker-6drsl" (UID: "3b91fb80-e839-4d96-aa9a-4e08642aafe1") : secret "metallb-memberlist" not found Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.943592 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" event={"ID":"bfefe2c3-8e13-45a3-b700-cda75a37345c","Type":"ContainerStarted","Data":"f86affbd3d6e0bc819b7abb85b38b54c43ba4ca760c7d95188c3b03fd8b3c2a3"} Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.945046 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-4pdp4" event={"ID":"defc1b9a-20a3-4272-af46-ad01ef957dba","Type":"ContainerStarted","Data":"5a64b2a9d75095d3715d83d8f282fc10a34d9c55c182e0c19633b4366ffd93f8"} Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.945069 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-4pdp4" event={"ID":"defc1b9a-20a3-4272-af46-ad01ef957dba","Type":"ContainerStarted","Data":"8af0796cb08f3d6d5552d6181e68bab590531bf1cc1b7d3a0272e68601f52d0f"} Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.945080 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-4pdp4" event={"ID":"defc1b9a-20a3-4272-af46-ad01ef957dba","Type":"ContainerStarted","Data":"f30e4cae65a307bb023ae9e02bc01aa7b6ed6d1b622a1d1f88604cca969cf657"} Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.945170 
4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.945960 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerStarted","Data":"6ee4bb7dc5d352cb73f395ae163daf8342f4926ea1621456b55e46949b3e29b6"} Oct 02 06:57:04 crc kubenswrapper[4786]: I1002 06:57:04.955462 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-4pdp4" podStartSLOduration=0.955446332 podStartE2EDuration="955.446332ms" podCreationTimestamp="2025-10-02 06:57:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:57:04.954784942 +0000 UTC m=+635.075968083" watchObservedRunningTime="2025-10-02 06:57:04.955446332 +0000 UTC m=+635.076629452" Oct 02 06:57:05 crc kubenswrapper[4786]: I1002 06:57:05.826792 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:05 crc kubenswrapper[4786]: I1002 06:57:05.831307 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b91fb80-e839-4d96-aa9a-4e08642aafe1-memberlist\") pod \"speaker-6drsl\" (UID: \"3b91fb80-e839-4d96-aa9a-4e08642aafe1\") " pod="metallb-system/speaker-6drsl" Oct 02 06:57:05 crc kubenswrapper[4786]: I1002 06:57:05.891121 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-6drsl" Oct 02 06:57:05 crc kubenswrapper[4786]: W1002 06:57:05.904994 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b91fb80_e839_4d96_aa9a_4e08642aafe1.slice/crio-12cbe4fe1f87f2046fa67e2d9327bcf69e26070be474be803eecb2112e8ce571 WatchSource:0}: Error finding container 12cbe4fe1f87f2046fa67e2d9327bcf69e26070be474be803eecb2112e8ce571: Status 404 returned error can't find the container with id 12cbe4fe1f87f2046fa67e2d9327bcf69e26070be474be803eecb2112e8ce571 Oct 02 06:57:05 crc kubenswrapper[4786]: I1002 06:57:05.952023 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6drsl" event={"ID":"3b91fb80-e839-4d96-aa9a-4e08642aafe1","Type":"ContainerStarted","Data":"12cbe4fe1f87f2046fa67e2d9327bcf69e26070be474be803eecb2112e8ce571"} Oct 02 06:57:06 crc kubenswrapper[4786]: I1002 06:57:06.963057 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6drsl" event={"ID":"3b91fb80-e839-4d96-aa9a-4e08642aafe1","Type":"ContainerStarted","Data":"f8d00b4b280314250f63e6a4ef443e5fc19dc2104a1d25728dec488cdc043db9"} Oct 02 06:57:06 crc kubenswrapper[4786]: I1002 06:57:06.963323 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6drsl" event={"ID":"3b91fb80-e839-4d96-aa9a-4e08642aafe1","Type":"ContainerStarted","Data":"aa061b2036ce505d0d4b3827b12b0f73ea1361cbb37f2aaa4017f99e0a48eaaf"} Oct 02 06:57:06 crc kubenswrapper[4786]: I1002 06:57:06.963343 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-6drsl" Oct 02 06:57:06 crc kubenswrapper[4786]: I1002 06:57:06.974286 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6drsl" podStartSLOduration=2.974269538 podStartE2EDuration="2.974269538s" podCreationTimestamp="2025-10-02 06:57:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:57:06.972942873 +0000 UTC m=+637.094126014" watchObservedRunningTime="2025-10-02 06:57:06.974269538 +0000 UTC m=+637.095452669" Oct 02 06:57:10 crc kubenswrapper[4786]: I1002 06:57:10.991932 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" event={"ID":"bfefe2c3-8e13-45a3-b700-cda75a37345c","Type":"ContainerStarted","Data":"076c735bfffc0fc6efd97a10ad22441340f5206784d0745c1ab57dfd64b53b30"} Oct 02 06:57:10 crc kubenswrapper[4786]: I1002 06:57:10.992345 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:10 crc kubenswrapper[4786]: I1002 06:57:10.993795 4786 generic.go:334] "Generic (PLEG): container finished" podID="0678c649-4b0b-4079-865a-7e85f6005a3d" containerID="786b6cb7a780bb31f6130d4759b54f186a055a5d26d0ad989444a1ffc44e7329" exitCode=0 Oct 02 06:57:10 crc kubenswrapper[4786]: I1002 06:57:10.993831 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerDied","Data":"786b6cb7a780bb31f6130d4759b54f186a055a5d26d0ad989444a1ffc44e7329"} Oct 02 06:57:11 crc kubenswrapper[4786]: I1002 06:57:11.003584 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" podStartSLOduration=1.110819718 podStartE2EDuration="7.00357317s" podCreationTimestamp="2025-10-02 06:57:04 +0000 UTC" firstStartedPulling="2025-10-02 06:57:04.684010108 +0000 UTC m=+634.805193239" lastFinishedPulling="2025-10-02 06:57:10.57676356 +0000 UTC m=+640.697946691" observedRunningTime="2025-10-02 06:57:11.001476912 +0000 UTC m=+641.122660063" watchObservedRunningTime="2025-10-02 06:57:11.00357317 +0000 UTC m=+641.124756291" Oct 02 06:57:12 
crc kubenswrapper[4786]: I1002 06:57:12.001128 4786 generic.go:334] "Generic (PLEG): container finished" podID="0678c649-4b0b-4079-865a-7e85f6005a3d" containerID="d42b578513a6f861f5353b663b7acc8dd6f70f276fbfb2bca5fe0595138cd2e3" exitCode=0 Oct 02 06:57:12 crc kubenswrapper[4786]: I1002 06:57:12.001214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerDied","Data":"d42b578513a6f861f5353b663b7acc8dd6f70f276fbfb2bca5fe0595138cd2e3"} Oct 02 06:57:13 crc kubenswrapper[4786]: I1002 06:57:13.006921 4786 generic.go:334] "Generic (PLEG): container finished" podID="0678c649-4b0b-4079-865a-7e85f6005a3d" containerID="253e5e529bc2452e51fd91f52985678db93d457773eea8bf88b0a321eb937fa0" exitCode=0 Oct 02 06:57:13 crc kubenswrapper[4786]: I1002 06:57:13.007017 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerDied","Data":"253e5e529bc2452e51fd91f52985678db93d457773eea8bf88b0a321eb937fa0"} Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.014366 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerStarted","Data":"4154fd93deefb5983796ae57ef471b7162fd56c20809a9353d09eb2da57eac1d"} Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.014578 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerStarted","Data":"0469711e9395abdce7b694b4568d779bbee24a0ded8b1e1e6f5a07e77d8786b7"} Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.014590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" 
event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerStarted","Data":"b7988dd8b5e555f2645b4db2c7f4d35760956c5f410d0f5001a4176a9322e091"} Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.014603 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.014611 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerStarted","Data":"8c89ddc682c3143b42ca438972d073aa3591a0bca171968b3ff0e23882ced320"} Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.014620 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerStarted","Data":"67a6f2db565b7b371b5c97529cbe660fb1a75359e17914692cfc45bd639def19"} Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.014627 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-lc2gd" event={"ID":"0678c649-4b0b-4079-865a-7e85f6005a3d","Type":"ContainerStarted","Data":"4adb6058578ce0a888a7806708c5657a8e5d4cfffb576738bfd7327a4262810b"} Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.031150 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-lc2gd" podStartSLOduration=3.856096997 podStartE2EDuration="10.031136968s" podCreationTimestamp="2025-10-02 06:57:04 +0000 UTC" firstStartedPulling="2025-10-02 06:57:04.414520429 +0000 UTC m=+634.535703560" lastFinishedPulling="2025-10-02 06:57:10.5895604 +0000 UTC m=+640.710743531" observedRunningTime="2025-10-02 06:57:14.028791018 +0000 UTC m=+644.149974159" watchObservedRunningTime="2025-10-02 06:57:14.031136968 +0000 UTC m=+644.152320099" Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.335142 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.363423 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:14 crc kubenswrapper[4786]: I1002 06:57:14.406303 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-4pdp4" Oct 02 06:57:24 crc kubenswrapper[4786]: I1002 06:57:24.337310 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-lc2gd" Oct 02 06:57:24 crc kubenswrapper[4786]: I1002 06:57:24.391389 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-qspcx" Oct 02 06:57:25 crc kubenswrapper[4786]: I1002 06:57:25.894200 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6drsl" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.003038 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-grfsw"] Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.003819 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-grfsw" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.005253 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5wxm2" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.005316 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.005660 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.012347 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-grfsw"] Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.095737 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r4ln\" (UniqueName: \"kubernetes.io/projected/025bc6f4-7160-494a-8717-a8db11cbbc18-kube-api-access-2r4ln\") pod \"openstack-operator-index-grfsw\" (UID: \"025bc6f4-7160-494a-8717-a8db11cbbc18\") " pod="openstack-operators/openstack-operator-index-grfsw" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.196519 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r4ln\" (UniqueName: \"kubernetes.io/projected/025bc6f4-7160-494a-8717-a8db11cbbc18-kube-api-access-2r4ln\") pod \"openstack-operator-index-grfsw\" (UID: \"025bc6f4-7160-494a-8717-a8db11cbbc18\") " pod="openstack-operators/openstack-operator-index-grfsw" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.210684 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r4ln\" (UniqueName: \"kubernetes.io/projected/025bc6f4-7160-494a-8717-a8db11cbbc18-kube-api-access-2r4ln\") pod \"openstack-operator-index-grfsw\" (UID: 
\"025bc6f4-7160-494a-8717-a8db11cbbc18\") " pod="openstack-operators/openstack-operator-index-grfsw" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.320040 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-grfsw" Oct 02 06:57:28 crc kubenswrapper[4786]: I1002 06:57:28.655959 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-grfsw"] Oct 02 06:57:28 crc kubenswrapper[4786]: W1002 06:57:28.660415 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod025bc6f4_7160_494a_8717_a8db11cbbc18.slice/crio-c1d129f3de3e21dd005f6e2de14d863657fde48c00875e72267c0518be0e817f WatchSource:0}: Error finding container c1d129f3de3e21dd005f6e2de14d863657fde48c00875e72267c0518be0e817f: Status 404 returned error can't find the container with id c1d129f3de3e21dd005f6e2de14d863657fde48c00875e72267c0518be0e817f Oct 02 06:57:29 crc kubenswrapper[4786]: I1002 06:57:29.079308 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-grfsw" event={"ID":"025bc6f4-7160-494a-8717-a8db11cbbc18","Type":"ContainerStarted","Data":"c1d129f3de3e21dd005f6e2de14d863657fde48c00875e72267c0518be0e817f"} Oct 02 06:57:30 crc kubenswrapper[4786]: I1002 06:57:30.085283 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-grfsw" event={"ID":"025bc6f4-7160-494a-8717-a8db11cbbc18","Type":"ContainerStarted","Data":"adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1"} Oct 02 06:57:30 crc kubenswrapper[4786]: I1002 06:57:30.098509 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-grfsw" podStartSLOduration=2.115363703 podStartE2EDuration="3.09849448s" podCreationTimestamp="2025-10-02 06:57:27 +0000 UTC" 
firstStartedPulling="2025-10-02 06:57:28.661943841 +0000 UTC m=+658.783126972" lastFinishedPulling="2025-10-02 06:57:29.645074618 +0000 UTC m=+659.766257749" observedRunningTime="2025-10-02 06:57:30.096620565 +0000 UTC m=+660.217803696" watchObservedRunningTime="2025-10-02 06:57:30.09849448 +0000 UTC m=+660.219677612" Oct 02 06:57:31 crc kubenswrapper[4786]: I1002 06:57:31.395083 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-grfsw"] Oct 02 06:57:31 crc kubenswrapper[4786]: I1002 06:57:31.998039 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-l4f42"] Oct 02 06:57:31 crc kubenswrapper[4786]: I1002 06:57:31.998667 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.002988 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-l4f42"] Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.094124 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-grfsw" podUID="025bc6f4-7160-494a-8717-a8db11cbbc18" containerName="registry-server" containerID="cri-o://adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1" gracePeriod=2 Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.142250 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm7tm\" (UniqueName: \"kubernetes.io/projected/d3794fc6-7388-43e3-bf57-545f152e19c4-kube-api-access-sm7tm\") pod \"openstack-operator-index-l4f42\" (UID: \"d3794fc6-7388-43e3-bf57-545f152e19c4\") " pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.243738 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sm7tm\" (UniqueName: \"kubernetes.io/projected/d3794fc6-7388-43e3-bf57-545f152e19c4-kube-api-access-sm7tm\") pod \"openstack-operator-index-l4f42\" (UID: \"d3794fc6-7388-43e3-bf57-545f152e19c4\") " pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.259021 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm7tm\" (UniqueName: \"kubernetes.io/projected/d3794fc6-7388-43e3-bf57-545f152e19c4-kube-api-access-sm7tm\") pod \"openstack-operator-index-l4f42\" (UID: \"d3794fc6-7388-43e3-bf57-545f152e19c4\") " pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.310564 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.372252 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-grfsw" Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.446114 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r4ln\" (UniqueName: \"kubernetes.io/projected/025bc6f4-7160-494a-8717-a8db11cbbc18-kube-api-access-2r4ln\") pod \"025bc6f4-7160-494a-8717-a8db11cbbc18\" (UID: \"025bc6f4-7160-494a-8717-a8db11cbbc18\") " Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.449436 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025bc6f4-7160-494a-8717-a8db11cbbc18-kube-api-access-2r4ln" (OuterVolumeSpecName: "kube-api-access-2r4ln") pod "025bc6f4-7160-494a-8717-a8db11cbbc18" (UID: "025bc6f4-7160-494a-8717-a8db11cbbc18"). InnerVolumeSpecName "kube-api-access-2r4ln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.547326 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r4ln\" (UniqueName: \"kubernetes.io/projected/025bc6f4-7160-494a-8717-a8db11cbbc18-kube-api-access-2r4ln\") on node \"crc\" DevicePath \"\"" Oct 02 06:57:32 crc kubenswrapper[4786]: I1002 06:57:32.640464 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-l4f42"] Oct 02 06:57:32 crc kubenswrapper[4786]: W1002 06:57:32.645596 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3794fc6_7388_43e3_bf57_545f152e19c4.slice/crio-cace02fc87da688d06d90a106c94bf7ccfc04f98df355bec01fcf30cbdcc1605 WatchSource:0}: Error finding container cace02fc87da688d06d90a106c94bf7ccfc04f98df355bec01fcf30cbdcc1605: Status 404 returned error can't find the container with id cace02fc87da688d06d90a106c94bf7ccfc04f98df355bec01fcf30cbdcc1605 Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.098760 4786 generic.go:334] "Generic (PLEG): container finished" podID="025bc6f4-7160-494a-8717-a8db11cbbc18" containerID="adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1" exitCode=0 Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.098799 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-grfsw" event={"ID":"025bc6f4-7160-494a-8717-a8db11cbbc18","Type":"ContainerDied","Data":"adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1"} Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.098815 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-grfsw" Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.098831 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-grfsw" event={"ID":"025bc6f4-7160-494a-8717-a8db11cbbc18","Type":"ContainerDied","Data":"c1d129f3de3e21dd005f6e2de14d863657fde48c00875e72267c0518be0e817f"} Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.098848 4786 scope.go:117] "RemoveContainer" containerID="adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1" Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.099965 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l4f42" event={"ID":"d3794fc6-7388-43e3-bf57-545f152e19c4","Type":"ContainerStarted","Data":"cace02fc87da688d06d90a106c94bf7ccfc04f98df355bec01fcf30cbdcc1605"} Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.109921 4786 scope.go:117] "RemoveContainer" containerID="adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1" Oct 02 06:57:33 crc kubenswrapper[4786]: E1002 06:57:33.110184 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1\": container with ID starting with adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1 not found: ID does not exist" containerID="adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1" Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.110216 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1"} err="failed to get container status \"adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1\": rpc error: code = NotFound desc = could not find container 
\"adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1\": container with ID starting with adebd8d96848eafb7cdef3f40c3cfd0fc6defc3fb79f75b932e567252ce7e1f1 not found: ID does not exist" Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.116529 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-grfsw"] Oct 02 06:57:33 crc kubenswrapper[4786]: I1002 06:57:33.119303 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-grfsw"] Oct 02 06:57:34 crc kubenswrapper[4786]: I1002 06:57:34.105127 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-l4f42" event={"ID":"d3794fc6-7388-43e3-bf57-545f152e19c4","Type":"ContainerStarted","Data":"06f2020436ac4f2035e2ae4f3f5da5ea441637f92f2101ca1ea77b65b90429be"} Oct 02 06:57:34 crc kubenswrapper[4786]: I1002 06:57:34.114970 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-l4f42" podStartSLOduration=2.6201152370000003 podStartE2EDuration="3.114957307s" podCreationTimestamp="2025-10-02 06:57:31 +0000 UTC" firstStartedPulling="2025-10-02 06:57:32.648395981 +0000 UTC m=+662.769579111" lastFinishedPulling="2025-10-02 06:57:33.14323805 +0000 UTC m=+663.264421181" observedRunningTime="2025-10-02 06:57:34.113132545 +0000 UTC m=+664.234315686" watchObservedRunningTime="2025-10-02 06:57:34.114957307 +0000 UTC m=+664.236140439" Oct 02 06:57:34 crc kubenswrapper[4786]: I1002 06:57:34.184468 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025bc6f4-7160-494a-8717-a8db11cbbc18" path="/var/lib/kubelet/pods/025bc6f4-7160-494a-8717-a8db11cbbc18/volumes" Oct 02 06:57:42 crc kubenswrapper[4786]: I1002 06:57:42.310796 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:42 crc kubenswrapper[4786]: 
I1002 06:57:42.311263 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:42 crc kubenswrapper[4786]: I1002 06:57:42.330900 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.158204 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-l4f42" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.619979 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n"] Oct 02 06:57:43 crc kubenswrapper[4786]: E1002 06:57:43.620198 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025bc6f4-7160-494a-8717-a8db11cbbc18" containerName="registry-server" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.620210 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="025bc6f4-7160-494a-8717-a8db11cbbc18" containerName="registry-server" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.620321 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="025bc6f4-7160-494a-8717-a8db11cbbc18" containerName="registry-server" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.620966 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.622312 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wvphz" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.625390 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n"] Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.751654 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-bundle\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.751770 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-util\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.751805 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqvps\" (UniqueName: \"kubernetes.io/projected/78d8253b-165c-4e4d-9c1a-2a22c6828e08-kube-api-access-lqvps\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 
06:57:43.852895 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-util\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.852941 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqvps\" (UniqueName: \"kubernetes.io/projected/78d8253b-165c-4e4d-9c1a-2a22c6828e08-kube-api-access-lqvps\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.852987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-bundle\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.853378 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-util\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.853414 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-bundle\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.867515 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqvps\" (UniqueName: \"kubernetes.io/projected/78d8253b-165c-4e4d-9c1a-2a22c6828e08-kube-api-access-lqvps\") pod \"0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:43 crc kubenswrapper[4786]: I1002 06:57:43.933535 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:44 crc kubenswrapper[4786]: I1002 06:57:44.284509 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n"] Oct 02 06:57:44 crc kubenswrapper[4786]: W1002 06:57:44.288564 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d8253b_165c_4e4d_9c1a_2a22c6828e08.slice/crio-d0b46251179cdfd61b0c05c5894eaafebfb51edff8944fea47c951703641e22a WatchSource:0}: Error finding container d0b46251179cdfd61b0c05c5894eaafebfb51edff8944fea47c951703641e22a: Status 404 returned error can't find the container with id d0b46251179cdfd61b0c05c5894eaafebfb51edff8944fea47c951703641e22a Oct 02 06:57:45 crc kubenswrapper[4786]: I1002 06:57:45.149183 4786 generic.go:334] "Generic (PLEG): container finished" podID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerID="1fb112c78481217e76d8c26b83a6fcef4d80dd766af37cd31cf6d5ce46e3b1f5" exitCode=0 Oct 02 
06:57:45 crc kubenswrapper[4786]: I1002 06:57:45.149231 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" event={"ID":"78d8253b-165c-4e4d-9c1a-2a22c6828e08","Type":"ContainerDied","Data":"1fb112c78481217e76d8c26b83a6fcef4d80dd766af37cd31cf6d5ce46e3b1f5"} Oct 02 06:57:45 crc kubenswrapper[4786]: I1002 06:57:45.149352 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" event={"ID":"78d8253b-165c-4e4d-9c1a-2a22c6828e08","Type":"ContainerStarted","Data":"d0b46251179cdfd61b0c05c5894eaafebfb51edff8944fea47c951703641e22a"} Oct 02 06:57:46 crc kubenswrapper[4786]: I1002 06:57:46.154447 4786 generic.go:334] "Generic (PLEG): container finished" podID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerID="2640d33b382069c37904ea02ebf3292f21920aae70abb6bb98e19b7c981b2dbf" exitCode=0 Oct 02 06:57:46 crc kubenswrapper[4786]: I1002 06:57:46.154673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" event={"ID":"78d8253b-165c-4e4d-9c1a-2a22c6828e08","Type":"ContainerDied","Data":"2640d33b382069c37904ea02ebf3292f21920aae70abb6bb98e19b7c981b2dbf"} Oct 02 06:57:47 crc kubenswrapper[4786]: I1002 06:57:47.161007 4786 generic.go:334] "Generic (PLEG): container finished" podID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerID="dc3d4b429b8269f9843f59551d02c28f2ab85de9acad6c34df8092dc2baaf38a" exitCode=0 Oct 02 06:57:47 crc kubenswrapper[4786]: I1002 06:57:47.161038 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" event={"ID":"78d8253b-165c-4e4d-9c1a-2a22c6828e08","Type":"ContainerDied","Data":"dc3d4b429b8269f9843f59551d02c28f2ab85de9acad6c34df8092dc2baaf38a"} Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.341023 
4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.501885 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqvps\" (UniqueName: \"kubernetes.io/projected/78d8253b-165c-4e4d-9c1a-2a22c6828e08-kube-api-access-lqvps\") pod \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.501926 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-bundle\") pod \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.501995 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-util\") pod \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\" (UID: \"78d8253b-165c-4e4d-9c1a-2a22c6828e08\") " Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.502417 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-bundle" (OuterVolumeSpecName: "bundle") pod "78d8253b-165c-4e4d-9c1a-2a22c6828e08" (UID: "78d8253b-165c-4e4d-9c1a-2a22c6828e08"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.505660 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d8253b-165c-4e4d-9c1a-2a22c6828e08-kube-api-access-lqvps" (OuterVolumeSpecName: "kube-api-access-lqvps") pod "78d8253b-165c-4e4d-9c1a-2a22c6828e08" (UID: "78d8253b-165c-4e4d-9c1a-2a22c6828e08"). 
InnerVolumeSpecName "kube-api-access-lqvps". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.511889 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-util" (OuterVolumeSpecName: "util") pod "78d8253b-165c-4e4d-9c1a-2a22c6828e08" (UID: "78d8253b-165c-4e4d-9c1a-2a22c6828e08"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.602857 4786 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-util\") on node \"crc\" DevicePath \"\"" Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.602881 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqvps\" (UniqueName: \"kubernetes.io/projected/78d8253b-165c-4e4d-9c1a-2a22c6828e08-kube-api-access-lqvps\") on node \"crc\" DevicePath \"\"" Oct 02 06:57:48 crc kubenswrapper[4786]: I1002 06:57:48.602892 4786 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/78d8253b-165c-4e4d-9c1a-2a22c6828e08-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 06:57:49 crc kubenswrapper[4786]: I1002 06:57:49.172076 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" event={"ID":"78d8253b-165c-4e4d-9c1a-2a22c6828e08","Type":"ContainerDied","Data":"d0b46251179cdfd61b0c05c5894eaafebfb51edff8944fea47c951703641e22a"} Oct 02 06:57:49 crc kubenswrapper[4786]: I1002 06:57:49.172276 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0b46251179cdfd61b0c05c5894eaafebfb51edff8944fea47c951703641e22a" Oct 02 06:57:49 crc kubenswrapper[4786]: I1002 06:57:49.172128 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.072220 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf"] Oct 02 06:57:51 crc kubenswrapper[4786]: E1002 06:57:51.072605 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerName="pull" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.072615 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerName="pull" Oct 02 06:57:51 crc kubenswrapper[4786]: E1002 06:57:51.072630 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerName="util" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.072635 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerName="util" Oct 02 06:57:51 crc kubenswrapper[4786]: E1002 06:57:51.072650 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerName="extract" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.072656 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerName="extract" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.072759 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d8253b-165c-4e4d-9c1a-2a22c6828e08" containerName="extract" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.073267 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.075903 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-7cqtl" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.089607 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf"] Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.231299 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf4pw\" (UniqueName: \"kubernetes.io/projected/f1ba6077-2c56-40f9-bcbc-64bae09186c6-kube-api-access-pf4pw\") pod \"openstack-operator-controller-operator-859455d779-ksrkf\" (UID: \"f1ba6077-2c56-40f9-bcbc-64bae09186c6\") " pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.332591 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf4pw\" (UniqueName: \"kubernetes.io/projected/f1ba6077-2c56-40f9-bcbc-64bae09186c6-kube-api-access-pf4pw\") pod \"openstack-operator-controller-operator-859455d779-ksrkf\" (UID: \"f1ba6077-2c56-40f9-bcbc-64bae09186c6\") " pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.347837 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf4pw\" (UniqueName: \"kubernetes.io/projected/f1ba6077-2c56-40f9-bcbc-64bae09186c6-kube-api-access-pf4pw\") pod \"openstack-operator-controller-operator-859455d779-ksrkf\" (UID: \"f1ba6077-2c56-40f9-bcbc-64bae09186c6\") " pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.386160 4786 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" Oct 02 06:57:51 crc kubenswrapper[4786]: I1002 06:57:51.735532 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf"] Oct 02 06:57:52 crc kubenswrapper[4786]: I1002 06:57:52.186928 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" event={"ID":"f1ba6077-2c56-40f9-bcbc-64bae09186c6","Type":"ContainerStarted","Data":"7309ca039bc4fdfba9fb2c261effa3f30bb73d464660981dbf0f8e932febfe67"} Oct 02 06:57:55 crc kubenswrapper[4786]: I1002 06:57:55.199751 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" event={"ID":"f1ba6077-2c56-40f9-bcbc-64bae09186c6","Type":"ContainerStarted","Data":"5c052e1d1c6220d73cbec739a95b54dc98e24823ef1edba4df11ac5977992e85"} Oct 02 06:57:57 crc kubenswrapper[4786]: I1002 06:57:57.210431 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" event={"ID":"f1ba6077-2c56-40f9-bcbc-64bae09186c6","Type":"ContainerStarted","Data":"0a512bb5972400a19cc525b978432f5edc702652b76f073e824a42da75e2bfc3"} Oct 02 06:57:57 crc kubenswrapper[4786]: I1002 06:57:57.210663 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" Oct 02 06:57:57 crc kubenswrapper[4786]: I1002 06:57:57.231229 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" podStartSLOduration=1.572627185 podStartE2EDuration="6.231218715s" podCreationTimestamp="2025-10-02 06:57:51 +0000 UTC" firstStartedPulling="2025-10-02 
06:57:51.742261096 +0000 UTC m=+681.863444227" lastFinishedPulling="2025-10-02 06:57:56.400852625 +0000 UTC m=+686.522035757" observedRunningTime="2025-10-02 06:57:57.228398826 +0000 UTC m=+687.349581967" watchObservedRunningTime="2025-10-02 06:57:57.231218715 +0000 UTC m=+687.352401847" Oct 02 06:57:57 crc kubenswrapper[4786]: I1002 06:57:57.497034 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 06:57:57 crc kubenswrapper[4786]: I1002 06:57:57.497083 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 06:58:01 crc kubenswrapper[4786]: I1002 06:58:01.388470 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-859455d779-ksrkf" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.245374 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.246608 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.248075 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.248830 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.250596 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bwkqj" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.251781 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xg4xq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.268532 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.271071 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.281580 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.282445 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.283435 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-jt8nc" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.284708 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.285447 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.286425 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.288414 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-44g4w" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.288551 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.289544 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vnkjb" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.299102 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.302046 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.318099 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.318782 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.318804 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.319369 4786 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.319677 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.320951 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-52bl7" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.321185 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-c9vvt" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.321186 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.322128 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.333305 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.334076 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.336271 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8znjn" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.344447 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.360213 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.369645 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.378974 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-c7v95" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.379424 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.389187 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.391077 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.393002 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-dlqkm" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.396115 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.401108 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.404931 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.405853 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.407305 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vhfrn" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.413019 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.418457 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.419295 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.420265 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kf7f\" (UniqueName: \"kubernetes.io/projected/458e49e3-37fa-4fde-b98e-35f6490ad3bc-kube-api-access-9kf7f\") pod \"heat-operator-controller-manager-5b4fc86755-xqp2m\" (UID: \"458e49e3-37fa-4fde-b98e-35f6490ad3bc\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.422048 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.422215 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-spshm" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.422874 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.427617 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.427806 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-v62xd" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.428574 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-cert\") pod \"infra-operator-controller-manager-5c8fdc4d5c-5vzvp\" (UID: \"903d28be-eebf-4dd0-bd15-3f3e2a9416bf\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.428629 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4585\" (UniqueName: \"kubernetes.io/projected/8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9-kube-api-access-j4585\") pod \"ironic-operator-controller-manager-5f45cd594f-m99xq\" (UID: \"8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9\") " pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.428672 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mv8d\" (UniqueName: \"kubernetes.io/projected/e229888c-d55e-4b2f-a53a-e588868f98e2-kube-api-access-9mv8d\") pod \"designate-operator-controller-manager-77fb7bcf5b-4rqms\" (UID: \"e229888c-d55e-4b2f-a53a-e588868f98e2\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.428708 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmwc\" (UniqueName: \"kubernetes.io/projected/0872cd5d-1fde-4b27-bdb7-6eade27cee9d-kube-api-access-fsmwc\") pod \"barbican-operator-controller-manager-f7f98cb69-dkqvh\" (UID: \"0872cd5d-1fde-4b27-bdb7-6eade27cee9d\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.428759 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsdqs\" (UniqueName: \"kubernetes.io/projected/9c27556b-9c6e-4a2d-9c2d-78f471392a85-kube-api-access-lsdqs\") pod \"horizon-operator-controller-manager-679b4759bb-xcd8d\" (UID: \"9c27556b-9c6e-4a2d-9c2d-78f471392a85\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.428797 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bzgc\" (UniqueName: \"kubernetes.io/projected/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-kube-api-access-5bzgc\") pod \"infra-operator-controller-manager-5c8fdc4d5c-5vzvp\" (UID: \"903d28be-eebf-4dd0-bd15-3f3e2a9416bf\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.428825 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcwn7\" (UniqueName: \"kubernetes.io/projected/40d01a1d-0613-43eb-824b-24b22a879822-kube-api-access-lcwn7\") pod \"glance-operator-controller-manager-8bc4775b5-w6s6f\" (UID: \"40d01a1d-0613-43eb-824b-24b22a879822\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.428856 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5hpdx\" (UniqueName: \"kubernetes.io/projected/90cbbf62-df45-490f-a76d-6b24fdfe6aa7-kube-api-access-5hpdx\") pod \"cinder-operator-controller-manager-859cd486d-5b4mb\" (UID: \"90cbbf62-df45-490f-a76d-6b24fdfe6aa7\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.432723 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.435851 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.436707 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.439516 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5rf8g" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.445670 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.459597 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.460466 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.462636 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.462870 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.463215 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-krwbv" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.463362 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.464214 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vqr9w" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.466231 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.473836 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.481179 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.482363 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.484591 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-2jm4r" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.488421 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531715 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kf7f\" (UniqueName: \"kubernetes.io/projected/458e49e3-37fa-4fde-b98e-35f6490ad3bc-kube-api-access-9kf7f\") pod \"heat-operator-controller-manager-5b4fc86755-xqp2m\" (UID: \"458e49e3-37fa-4fde-b98e-35f6490ad3bc\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531769 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xztnf\" (UniqueName: \"kubernetes.io/projected/e49d1086-42df-449b-97b8-787edd49ba23-kube-api-access-xztnf\") pod \"manila-operator-controller-manager-b7cf8cb5f-x6hzs\" (UID: \"e49d1086-42df-449b-97b8-787edd49ba23\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531799 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg95\" (UniqueName: \"kubernetes.io/projected/bafce6df-ad32-4af9-9e38-d6da16305ee9-kube-api-access-jkg95\") pod \"keystone-operator-controller-manager-59d7dc95cf-jf2qh\" (UID: \"bafce6df-ad32-4af9-9e38-d6da16305ee9\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531820 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vqh\" (UniqueName: \"kubernetes.io/projected/12f9e929-b1eb-4dd9-a686-0154f89b5dfc-kube-api-access-z9vqh\") pod \"neutron-operator-controller-manager-54fbbfcd44-jpgjv\" (UID: \"12f9e929-b1eb-4dd9-a686-0154f89b5dfc\") " pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531844 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-cert\") pod \"infra-operator-controller-manager-5c8fdc4d5c-5vzvp\" (UID: \"903d28be-eebf-4dd0-bd15-3f3e2a9416bf\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531859 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4585\" (UniqueName: \"kubernetes.io/projected/8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9-kube-api-access-j4585\") pod \"ironic-operator-controller-manager-5f45cd594f-m99xq\" (UID: \"8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9\") " pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531884 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mv8d\" (UniqueName: \"kubernetes.io/projected/e229888c-d55e-4b2f-a53a-e588868f98e2-kube-api-access-9mv8d\") pod \"designate-operator-controller-manager-77fb7bcf5b-4rqms\" (UID: \"e229888c-d55e-4b2f-a53a-e588868f98e2\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531900 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7bj\" (UniqueName: 
\"kubernetes.io/projected/71792270-25e9-4028-b306-c235e6378802-kube-api-access-mn7bj\") pod \"nova-operator-controller-manager-7fd5b6bbc6-8cfbh\" (UID: \"71792270-25e9-4028-b306-c235e6378802\") " pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531917 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmwc\" (UniqueName: \"kubernetes.io/projected/0872cd5d-1fde-4b27-bdb7-6eade27cee9d-kube-api-access-fsmwc\") pod \"barbican-operator-controller-manager-f7f98cb69-dkqvh\" (UID: \"0872cd5d-1fde-4b27-bdb7-6eade27cee9d\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531942 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsdqs\" (UniqueName: \"kubernetes.io/projected/9c27556b-9c6e-4a2d-9c2d-78f471392a85-kube-api-access-lsdqs\") pod \"horizon-operator-controller-manager-679b4759bb-xcd8d\" (UID: \"9c27556b-9c6e-4a2d-9c2d-78f471392a85\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531969 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bzgc\" (UniqueName: \"kubernetes.io/projected/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-kube-api-access-5bzgc\") pod \"infra-operator-controller-manager-5c8fdc4d5c-5vzvp\" (UID: \"903d28be-eebf-4dd0-bd15-3f3e2a9416bf\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.531993 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcwn7\" (UniqueName: \"kubernetes.io/projected/40d01a1d-0613-43eb-824b-24b22a879822-kube-api-access-lcwn7\") pod \"glance-operator-controller-manager-8bc4775b5-w6s6f\" (UID: 
\"40d01a1d-0613-43eb-824b-24b22a879822\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.532008 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftvf\" (UniqueName: \"kubernetes.io/projected/f496c1c2-326e-4429-b192-07a5ca33b28d-kube-api-access-xftvf\") pod \"octavia-operator-controller-manager-75f8d67d86-xv465\" (UID: \"f496c1c2-326e-4429-b192-07a5ca33b28d\") " pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.532034 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hpdx\" (UniqueName: \"kubernetes.io/projected/90cbbf62-df45-490f-a76d-6b24fdfe6aa7-kube-api-access-5hpdx\") pod \"cinder-operator-controller-manager-859cd486d-5b4mb\" (UID: \"90cbbf62-df45-490f-a76d-6b24fdfe6aa7\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.532050 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkbd\" (UniqueName: \"kubernetes.io/projected/3d6bf83e-81f7-4c56-994f-1058c8ddbe74-kube-api-access-sbkbd\") pod \"mariadb-operator-controller-manager-67bf5bb885-7xngz\" (UID: \"3d6bf83e-81f7-4c56-994f-1058c8ddbe74\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" Oct 02 06:58:17 crc kubenswrapper[4786]: E1002 06:58:17.532146 4786 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 02 06:58:17 crc kubenswrapper[4786]: E1002 06:58:17.532181 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-cert podName:903d28be-eebf-4dd0-bd15-3f3e2a9416bf nodeName:}" 
failed. No retries permitted until 2025-10-02 06:58:18.032167293 +0000 UTC m=+708.153350425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-cert") pod "infra-operator-controller-manager-5c8fdc4d5c-5vzvp" (UID: "903d28be-eebf-4dd0-bd15-3f3e2a9416bf") : secret "infra-operator-webhook-server-cert" not found Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.544945 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.545880 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.548279 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.549273 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mv8d\" (UniqueName: \"kubernetes.io/projected/e229888c-d55e-4b2f-a53a-e588868f98e2-kube-api-access-9mv8d\") pod \"designate-operator-controller-manager-77fb7bcf5b-4rqms\" (UID: \"e229888c-d55e-4b2f-a53a-e588868f98e2\") " pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.549272 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hpdx\" (UniqueName: \"kubernetes.io/projected/90cbbf62-df45-490f-a76d-6b24fdfe6aa7-kube-api-access-5hpdx\") pod \"cinder-operator-controller-manager-859cd486d-5b4mb\" (UID: \"90cbbf62-df45-490f-a76d-6b24fdfe6aa7\") " pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.549636 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4585\" (UniqueName: \"kubernetes.io/projected/8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9-kube-api-access-j4585\") pod \"ironic-operator-controller-manager-5f45cd594f-m99xq\" (UID: \"8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9\") " pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.550587 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-r92z4" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.550704 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcwn7\" (UniqueName: \"kubernetes.io/projected/40d01a1d-0613-43eb-824b-24b22a879822-kube-api-access-lcwn7\") pod \"glance-operator-controller-manager-8bc4775b5-w6s6f\" (UID: \"40d01a1d-0613-43eb-824b-24b22a879822\") " pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.552072 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kf7f\" (UniqueName: \"kubernetes.io/projected/458e49e3-37fa-4fde-b98e-35f6490ad3bc-kube-api-access-9kf7f\") pod \"heat-operator-controller-manager-5b4fc86755-xqp2m\" (UID: \"458e49e3-37fa-4fde-b98e-35f6490ad3bc\") " pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.553616 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsdqs\" (UniqueName: \"kubernetes.io/projected/9c27556b-9c6e-4a2d-9c2d-78f471392a85-kube-api-access-lsdqs\") pod \"horizon-operator-controller-manager-679b4759bb-xcd8d\" (UID: \"9c27556b-9c6e-4a2d-9c2d-78f471392a85\") " pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.553970 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmwc\" (UniqueName: \"kubernetes.io/projected/0872cd5d-1fde-4b27-bdb7-6eade27cee9d-kube-api-access-fsmwc\") pod \"barbican-operator-controller-manager-f7f98cb69-dkqvh\" (UID: \"0872cd5d-1fde-4b27-bdb7-6eade27cee9d\") " pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.562303 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.564418 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bzgc\" (UniqueName: \"kubernetes.io/projected/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-kube-api-access-5bzgc\") pod \"infra-operator-controller-manager-5c8fdc4d5c-5vzvp\" (UID: \"903d28be-eebf-4dd0-bd15-3f3e2a9416bf\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.572332 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.595940 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.605260 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.614314 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636043 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck8qn\" (UniqueName: \"kubernetes.io/projected/8e441bad-8657-4e25-a66b-caf4fc28ae8a-kube-api-access-ck8qn\") pod \"swift-operator-controller-manager-689b4f76c9-gslcx\" (UID: \"8e441bad-8657-4e25-a66b-caf4fc28ae8a\") " pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636286 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56dd0e8b-9081-4dbc-8683-a0da0f1be122-cert\") pod \"openstack-baremetal-operator-controller-manager-787874f5b7tcwtm\" (UID: \"56dd0e8b-9081-4dbc-8683-a0da0f1be122\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636309 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xztnf\" (UniqueName: \"kubernetes.io/projected/e49d1086-42df-449b-97b8-787edd49ba23-kube-api-access-xztnf\") pod \"manila-operator-controller-manager-b7cf8cb5f-x6hzs\" (UID: \"e49d1086-42df-449b-97b8-787edd49ba23\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636328 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg95\" (UniqueName: \"kubernetes.io/projected/bafce6df-ad32-4af9-9e38-d6da16305ee9-kube-api-access-jkg95\") pod \"keystone-operator-controller-manager-59d7dc95cf-jf2qh\" (UID: \"bafce6df-ad32-4af9-9e38-d6da16305ee9\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" Oct 02 06:58:17 crc 
kubenswrapper[4786]: I1002 06:58:17.636346 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vqh\" (UniqueName: \"kubernetes.io/projected/12f9e929-b1eb-4dd9-a686-0154f89b5dfc-kube-api-access-z9vqh\") pod \"neutron-operator-controller-manager-54fbbfcd44-jpgjv\" (UID: \"12f9e929-b1eb-4dd9-a686-0154f89b5dfc\") " pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636365 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h6mz\" (UniqueName: \"kubernetes.io/projected/7aaa7a8a-2599-4f27-9bce-1fcee450fbfa-kube-api-access-6h6mz\") pod \"ovn-operator-controller-manager-84c745747f-p85rq\" (UID: \"7aaa7a8a-2599-4f27-9bce-1fcee450fbfa\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636399 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7bj\" (UniqueName: \"kubernetes.io/projected/71792270-25e9-4028-b306-c235e6378802-kube-api-access-mn7bj\") pod \"nova-operator-controller-manager-7fd5b6bbc6-8cfbh\" (UID: \"71792270-25e9-4028-b306-c235e6378802\") " pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636422 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgt62\" (UniqueName: \"kubernetes.io/projected/ed52f44e-1a43-4d07-a613-0d5d7c367a51-kube-api-access-fgt62\") pod \"placement-operator-controller-manager-598c4c8547-r2qcj\" (UID: \"ed52f44e-1a43-4d07-a613-0d5d7c367a51\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636438 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-75txk\" (UniqueName: \"kubernetes.io/projected/56dd0e8b-9081-4dbc-8683-a0da0f1be122-kube-api-access-75txk\") pod \"openstack-baremetal-operator-controller-manager-787874f5b7tcwtm\" (UID: \"56dd0e8b-9081-4dbc-8683-a0da0f1be122\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636497 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftvf\" (UniqueName: \"kubernetes.io/projected/f496c1c2-326e-4429-b192-07a5ca33b28d-kube-api-access-xftvf\") pod \"octavia-operator-controller-manager-75f8d67d86-xv465\" (UID: \"f496c1c2-326e-4429-b192-07a5ca33b28d\") " pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.636525 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkbd\" (UniqueName: \"kubernetes.io/projected/3d6bf83e-81f7-4c56-994f-1058c8ddbe74-kube-api-access-sbkbd\") pod \"mariadb-operator-controller-manager-67bf5bb885-7xngz\" (UID: \"3d6bf83e-81f7-4c56-994f-1058c8ddbe74\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.641371 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.642304 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.643534 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.652746 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.653139 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.660449 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6fn7c" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.669314 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7bj\" (UniqueName: \"kubernetes.io/projected/71792270-25e9-4028-b306-c235e6378802-kube-api-access-mn7bj\") pod \"nova-operator-controller-manager-7fd5b6bbc6-8cfbh\" (UID: \"71792270-25e9-4028-b306-c235e6378802\") " pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.669812 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vqh\" (UniqueName: \"kubernetes.io/projected/12f9e929-b1eb-4dd9-a686-0154f89b5dfc-kube-api-access-z9vqh\") pod \"neutron-operator-controller-manager-54fbbfcd44-jpgjv\" (UID: \"12f9e929-b1eb-4dd9-a686-0154f89b5dfc\") " pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.682167 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkg95\" (UniqueName: \"kubernetes.io/projected/bafce6df-ad32-4af9-9e38-d6da16305ee9-kube-api-access-jkg95\") pod \"keystone-operator-controller-manager-59d7dc95cf-jf2qh\" (UID: 
\"bafce6df-ad32-4af9-9e38-d6da16305ee9\") " pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.682305 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xztnf\" (UniqueName: \"kubernetes.io/projected/e49d1086-42df-449b-97b8-787edd49ba23-kube-api-access-xztnf\") pod \"manila-operator-controller-manager-b7cf8cb5f-x6hzs\" (UID: \"e49d1086-42df-449b-97b8-787edd49ba23\") " pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.682648 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkbd\" (UniqueName: \"kubernetes.io/projected/3d6bf83e-81f7-4c56-994f-1058c8ddbe74-kube-api-access-sbkbd\") pod \"mariadb-operator-controller-manager-67bf5bb885-7xngz\" (UID: \"3d6bf83e-81f7-4c56-994f-1058c8ddbe74\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.683981 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftvf\" (UniqueName: \"kubernetes.io/projected/f496c1c2-326e-4429-b192-07a5ca33b28d-kube-api-access-xftvf\") pod \"octavia-operator-controller-manager-75f8d67d86-xv465\" (UID: \"f496c1c2-326e-4429-b192-07a5ca33b28d\") " pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.686347 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.714666 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.719602 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.739445 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck8qn\" (UniqueName: \"kubernetes.io/projected/8e441bad-8657-4e25-a66b-caf4fc28ae8a-kube-api-access-ck8qn\") pod \"swift-operator-controller-manager-689b4f76c9-gslcx\" (UID: \"8e441bad-8657-4e25-a66b-caf4fc28ae8a\") " pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.739510 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56dd0e8b-9081-4dbc-8683-a0da0f1be122-cert\") pod \"openstack-baremetal-operator-controller-manager-787874f5b7tcwtm\" (UID: \"56dd0e8b-9081-4dbc-8683-a0da0f1be122\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.739540 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h6mz\" (UniqueName: \"kubernetes.io/projected/7aaa7a8a-2599-4f27-9bce-1fcee450fbfa-kube-api-access-6h6mz\") pod \"ovn-operator-controller-manager-84c745747f-p85rq\" (UID: \"7aaa7a8a-2599-4f27-9bce-1fcee450fbfa\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.739582 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgt62\" (UniqueName: \"kubernetes.io/projected/ed52f44e-1a43-4d07-a613-0d5d7c367a51-kube-api-access-fgt62\") pod 
\"placement-operator-controller-manager-598c4c8547-r2qcj\" (UID: \"ed52f44e-1a43-4d07-a613-0d5d7c367a51\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.739603 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvhcj\" (UniqueName: \"kubernetes.io/projected/ae767b5f-e700-4253-be42-48f90b9fe99e-kube-api-access-fvhcj\") pod \"telemetry-operator-controller-manager-cb66d6b59-89t2n\" (UID: \"ae767b5f-e700-4253-be42-48f90b9fe99e\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.739623 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75txk\" (UniqueName: \"kubernetes.io/projected/56dd0e8b-9081-4dbc-8683-a0da0f1be122-kube-api-access-75txk\") pod \"openstack-baremetal-operator-controller-manager-787874f5b7tcwtm\" (UID: \"56dd0e8b-9081-4dbc-8683-a0da0f1be122\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:17 crc kubenswrapper[4786]: E1002 06:58:17.740042 4786 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 06:58:17 crc kubenswrapper[4786]: E1002 06:58:17.740095 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56dd0e8b-9081-4dbc-8683-a0da0f1be122-cert podName:56dd0e8b-9081-4dbc-8683-a0da0f1be122 nodeName:}" failed. No retries permitted until 2025-10-02 06:58:18.240080904 +0000 UTC m=+708.361264035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/56dd0e8b-9081-4dbc-8683-a0da0f1be122-cert") pod "openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" (UID: "56dd0e8b-9081-4dbc-8683-a0da0f1be122") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.743744 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.751759 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.755564 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.756852 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.757224 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.768147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgt62\" (UniqueName: \"kubernetes.io/projected/ed52f44e-1a43-4d07-a613-0d5d7c367a51-kube-api-access-fgt62\") pod \"placement-operator-controller-manager-598c4c8547-r2qcj\" (UID: \"ed52f44e-1a43-4d07-a613-0d5d7c367a51\") " pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.769661 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h6mz\" (UniqueName: \"kubernetes.io/projected/7aaa7a8a-2599-4f27-9bce-1fcee450fbfa-kube-api-access-6h6mz\") pod \"ovn-operator-controller-manager-84c745747f-p85rq\" (UID: \"7aaa7a8a-2599-4f27-9bce-1fcee450fbfa\") " pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.769813 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vhwzx" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.775303 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck8qn\" (UniqueName: \"kubernetes.io/projected/8e441bad-8657-4e25-a66b-caf4fc28ae8a-kube-api-access-ck8qn\") pod \"swift-operator-controller-manager-689b4f76c9-gslcx\" (UID: \"8e441bad-8657-4e25-a66b-caf4fc28ae8a\") " pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.779004 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.779820 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-75txk\" (UniqueName: \"kubernetes.io/projected/56dd0e8b-9081-4dbc-8683-a0da0f1be122-kube-api-access-75txk\") pod \"openstack-baremetal-operator-controller-manager-787874f5b7tcwtm\" (UID: \"56dd0e8b-9081-4dbc-8683-a0da0f1be122\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.779814 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.802140 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.838668 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.841500 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llz6b\" (UniqueName: \"kubernetes.io/projected/fcbaee3e-a527-4eb0-9c2b-7ada804e7920-kube-api-access-llz6b\") pod \"test-operator-controller-manager-cbdf6dc66-phjsn\" (UID: \"fcbaee3e-a527-4eb0-9c2b-7ada804e7920\") " pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.841592 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvhcj\" (UniqueName: \"kubernetes.io/projected/ae767b5f-e700-4253-be42-48f90b9fe99e-kube-api-access-fvhcj\") pod \"telemetry-operator-controller-manager-cb66d6b59-89t2n\" (UID: \"ae767b5f-e700-4253-be42-48f90b9fe99e\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.842553 4786 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.848477 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-chb4s" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.853764 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.860936 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvhcj\" (UniqueName: \"kubernetes.io/projected/ae767b5f-e700-4253-be42-48f90b9fe99e-kube-api-access-fvhcj\") pod \"telemetry-operator-controller-manager-cb66d6b59-89t2n\" (UID: \"ae767b5f-e700-4253-be42-48f90b9fe99e\") " pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.948251 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.948584 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72s9\" (UniqueName: \"kubernetes.io/projected/810706c5-80b3-47b8-8058-2b0aa1665942-kube-api-access-d72s9\") pod \"watcher-operator-controller-manager-68d7bc5569-6n59c\" (UID: \"810706c5-80b3-47b8-8058-2b0aa1665942\") " pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.948644 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llz6b\" (UniqueName: \"kubernetes.io/projected/fcbaee3e-a527-4eb0-9c2b-7ada804e7920-kube-api-access-llz6b\") pod \"test-operator-controller-manager-cbdf6dc66-phjsn\" (UID: \"fcbaee3e-a527-4eb0-9c2b-7ada804e7920\") " pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.956469 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.957309 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.960844 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sp9jb" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.961023 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.987824 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk"] Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.987921 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" Oct 02 06:58:17 crc kubenswrapper[4786]: I1002 06:58:17.994228 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llz6b\" (UniqueName: \"kubernetes.io/projected/fcbaee3e-a527-4eb0-9c2b-7ada804e7920-kube-api-access-llz6b\") pod \"test-operator-controller-manager-cbdf6dc66-phjsn\" (UID: \"fcbaee3e-a527-4eb0-9c2b-7ada804e7920\") " pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.049259 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8xw\" (UniqueName: \"kubernetes.io/projected/8830c499-339f-4252-8f35-13671a0b6687-kube-api-access-vx8xw\") pod \"openstack-operator-controller-manager-67698bcd47-cjhhk\" (UID: \"8830c499-339f-4252-8f35-13671a0b6687\") " pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.049855 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-cert\") pod \"infra-operator-controller-manager-5c8fdc4d5c-5vzvp\" (UID: \"903d28be-eebf-4dd0-bd15-3f3e2a9416bf\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.050160 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert\") pod \"openstack-operator-controller-manager-67698bcd47-cjhhk\" (UID: \"8830c499-339f-4252-8f35-13671a0b6687\") " pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.050211 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d72s9\" (UniqueName: \"kubernetes.io/projected/810706c5-80b3-47b8-8058-2b0aa1665942-kube-api-access-d72s9\") pod \"watcher-operator-controller-manager-68d7bc5569-6n59c\" (UID: \"810706c5-80b3-47b8-8058-2b0aa1665942\") " pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.057636 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/903d28be-eebf-4dd0-bd15-3f3e2a9416bf-cert\") pod \"infra-operator-controller-manager-5c8fdc4d5c-5vzvp\" (UID: \"903d28be-eebf-4dd0-bd15-3f3e2a9416bf\") " pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.057706 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.058455 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.061813 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-z9pcd" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.069600 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72s9\" (UniqueName: \"kubernetes.io/projected/810706c5-80b3-47b8-8058-2b0aa1665942-kube-api-access-d72s9\") pod \"watcher-operator-controller-manager-68d7bc5569-6n59c\" (UID: \"810706c5-80b3-47b8-8058-2b0aa1665942\") " pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.083382 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.141032 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.151650 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert\") pod \"openstack-operator-controller-manager-67698bcd47-cjhhk\" (UID: \"8830c499-339f-4252-8f35-13671a0b6687\") " pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.151706 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh6cg\" (UniqueName: \"kubernetes.io/projected/4910bc83-927a-41bb-a4a6-834c09bcbc6e-kube-api-access-fh6cg\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5\" (UID: \"4910bc83-927a-41bb-a4a6-834c09bcbc6e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.151735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8xw\" (UniqueName: \"kubernetes.io/projected/8830c499-339f-4252-8f35-13671a0b6687-kube-api-access-vx8xw\") pod \"openstack-operator-controller-manager-67698bcd47-cjhhk\" (UID: \"8830c499-339f-4252-8f35-13671a0b6687\") " pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.151820 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.151893 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert podName:8830c499-339f-4252-8f35-13671a0b6687 nodeName:}" failed. 
No retries permitted until 2025-10-02 06:58:18.651875156 +0000 UTC m=+708.773058288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert") pod "openstack-operator-controller-manager-67698bcd47-cjhhk" (UID: "8830c499-339f-4252-8f35-13671a0b6687") : secret "webhook-server-cert" not found Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.169176 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8xw\" (UniqueName: \"kubernetes.io/projected/8830c499-339f-4252-8f35-13671a0b6687-kube-api-access-vx8xw\") pod \"openstack-operator-controller-manager-67698bcd47-cjhhk\" (UID: \"8830c499-339f-4252-8f35-13671a0b6687\") " pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.180982 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.234112 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.246957 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.251558 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.254702 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh6cg\" (UniqueName: \"kubernetes.io/projected/4910bc83-927a-41bb-a4a6-834c09bcbc6e-kube-api-access-fh6cg\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5\" (UID: \"4910bc83-927a-41bb-a4a6-834c09bcbc6e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.254765 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56dd0e8b-9081-4dbc-8683-a0da0f1be122-cert\") pod \"openstack-baremetal-operator-controller-manager-787874f5b7tcwtm\" (UID: \"56dd0e8b-9081-4dbc-8683-a0da0f1be122\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.258771 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56dd0e8b-9081-4dbc-8683-a0da0f1be122-cert\") pod \"openstack-baremetal-operator-controller-manager-787874f5b7tcwtm\" (UID: \"56dd0e8b-9081-4dbc-8683-a0da0f1be122\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.274765 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90cbbf62_df45_490f_a76d_6b24fdfe6aa7.slice/crio-d599e02eeff88adf36ae5a06ba69d5b3a273a93edb410d835f73ad5282fbfe9f WatchSource:0}: Error finding container d599e02eeff88adf36ae5a06ba69d5b3a273a93edb410d835f73ad5282fbfe9f: Status 404 returned error can't find the container with id d599e02eeff88adf36ae5a06ba69d5b3a273a93edb410d835f73ad5282fbfe9f Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.282368 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh6cg\" (UniqueName: \"kubernetes.io/projected/4910bc83-927a-41bb-a4a6-834c09bcbc6e-kube-api-access-fh6cg\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5\" (UID: \"4910bc83-927a-41bb-a4a6-834c09bcbc6e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.304229 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" event={"ID":"0872cd5d-1fde-4b27-bdb7-6eade27cee9d","Type":"ContainerStarted","Data":"7f4bbd3526728f86e5c7f4162e720ba26035f24f4dae2adcfcfdbcb3ae90e297"} Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.305075 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" event={"ID":"90cbbf62-df45-490f-a76d-6b24fdfe6aa7","Type":"ContainerStarted","Data":"d599e02eeff88adf36ae5a06ba69d5b3a273a93edb410d835f73ad5282fbfe9f"} Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.374075 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.386084 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.419057 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m"] Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.430547 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458e49e3_37fa_4fde_b98e_35f6490ad3bc.slice/crio-2bf5c4175645ae64f44d995501551cad28cfe7df46155a504d7d82c9bd21e57d WatchSource:0}: Error finding container 2bf5c4175645ae64f44d995501551cad28cfe7df46155a504d7d82c9bd21e57d: Status 404 returned error can't find the container with id 2bf5c4175645ae64f44d995501551cad28cfe7df46155a504d7d82c9bd21e57d Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.445641 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.454017 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.458934 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.461836 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq"] Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.467879 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40d01a1d_0613_43eb_824b_24b22a879822.slice/crio-a46869e9a552b430dc7d9a05f9ee89468d34332b4849d0384eaffde05ead53f7 WatchSource:0}: Error finding container 
a46869e9a552b430dc7d9a05f9ee89468d34332b4849d0384eaffde05ead53f7: Status 404 returned error can't find the container with id a46869e9a552b430dc7d9a05f9ee89468d34332b4849d0384eaffde05ead53f7 Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.470406 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.479256 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh"] Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.479399 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c27556b_9c6e_4a2d_9c2d_78f471392a85.slice/crio-acf20403b3a5e92260d088dd546cb4a70897b7f12bde177f3195211b7ffb65ba WatchSource:0}: Error finding container acf20403b3a5e92260d088dd546cb4a70897b7f12bde177f3195211b7ffb65ba: Status 404 returned error can't find the container with id acf20403b3a5e92260d088dd546cb4a70897b7f12bde177f3195211b7ffb65ba Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.486027 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c50728d_dcc7_4cc1_a0fb_ef8c1355d6f9.slice/crio-ac3f9279457f8c6f17f5acb6c0ccf774abb9ad9313b1eb921596fcbf44c21368 WatchSource:0}: Error finding container ac3f9279457f8c6f17f5acb6c0ccf774abb9ad9313b1eb921596fcbf44c21368: Status 404 returned error can't find the container with id ac3f9279457f8c6f17f5acb6c0ccf774abb9ad9313b1eb921596fcbf44c21368 Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.488961 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.635292 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj"] Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.642502 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz"] Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.648993 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fgt62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-598c4c8547-r2qcj_openstack-operators(ed52f44e-1a43-4d07-a613-0d5d7c367a51): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.654038 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fvhcj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-cb66d6b59-89t2n_openstack-operators(ae767b5f-e700-4253-be42-48f90b9fe99e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.655782 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n"] Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.656309 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aaa7a8a_2599_4f27_9bce_1fcee450fbfa.slice/crio-74d9f82fb59c3544bb393941e61847656fedf411f7902315bb4bc463b2bd062e WatchSource:0}: Error finding container 74d9f82fb59c3544bb393941e61847656fedf411f7902315bb4bc463b2bd062e: Status 404 returned error can't find the container with id 74d9f82fb59c3544bb393941e61847656fedf411f7902315bb4bc463b2bd062e Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.658773 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6h6mz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-84c745747f-p85rq_openstack-operators(7aaa7a8a-2599-4f27-9bce-1fcee450fbfa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.662150 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert\") pod \"openstack-operator-controller-manager-67698bcd47-cjhhk\" (UID: \"8830c499-339f-4252-8f35-13671a0b6687\") " pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.662368 4786 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.662432 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert podName:8830c499-339f-4252-8f35-13671a0b6687 nodeName:}" failed. No retries permitted until 2025-10-02 06:58:19.662398771 +0000 UTC m=+709.783581902 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert") pod "openstack-operator-controller-manager-67698bcd47-cjhhk" (UID: "8830c499-339f-4252-8f35-13671a0b6687") : secret "webhook-server-cert" not found
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.662966 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq"]
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.677941 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xftvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-75f8d67d86-xv465_openstack-operators(f496c1c2-326e-4429-b192-07a5ca33b28d): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.680347 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z9vqh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54fbbfcd44-jpgjv_openstack-operators(12f9e929-b1eb-4dd9-a686-0154f89b5dfc): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.682450 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465"]
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.689763 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv"]
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.810595 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" podUID="7aaa7a8a-2599-4f27-9bce-1fcee450fbfa"
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.811476 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" podUID="ed52f44e-1a43-4d07-a613-0d5d7c367a51"
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.814156 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" podUID="ae767b5f-e700-4253-be42-48f90b9fe99e"
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.827485 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c"]
Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.830910 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810706c5_80b3_47b8_8058_2b0aa1665942.slice/crio-bb17c19d2173a4c4e25461423f02c88e55e31363286799640ac2c525e1e9f5fa WatchSource:0}: Error finding container bb17c19d2173a4c4e25461423f02c88e55e31363286799640ac2c525e1e9f5fa: Status 404 returned error can't find the container with id bb17c19d2173a4c4e25461423f02c88e55e31363286799640ac2c525e1e9f5fa
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.835293 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx"]
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.835332 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" podUID="f496c1c2-326e-4429-b192-07a5ca33b28d"
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.839599 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp"]
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.843752 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn"]
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.844156 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ck8qn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-689b4f76c9-gslcx_openstack-operators(8e441bad-8657-4e25-a66b-caf4fc28ae8a): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.847400 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod903d28be_eebf_4dd0_bd15_3f3e2a9416bf.slice/crio-c9782706d2260fc8b29a1fce9f5eeb86e2ca33f77a0f065f229acc460a667d6a WatchSource:0}: Error finding container c9782706d2260fc8b29a1fce9f5eeb86e2ca33f77a0f065f229acc460a667d6a: Status 404 returned error can't find the container with id c9782706d2260fc8b29a1fce9f5eeb86e2ca33f77a0f065f229acc460a667d6a
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.853294 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5bzgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-5c8fdc4d5c-5vzvp_openstack-operators(903d28be-eebf-4dd0-bd15-3f3e2a9416bf): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.855008 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5"]
Oct 02 06:58:18 crc kubenswrapper[4786]: I1002 06:58:18.859093 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm"]
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.861802 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fh6cg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5_openstack-operators(4910bc83-927a-41bb-a4a6-834c09bcbc6e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.862778 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-llz6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-cbdf6dc66-phjsn_openstack-operators(fcbaee3e-a527-4eb0-9c2b-7ada804e7920): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.863082 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" podUID="12f9e929-b1eb-4dd9-a686-0154f89b5dfc"
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.863088 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" podUID="4910bc83-927a-41bb-a4a6-834c09bcbc6e"
Oct 02 06:58:18 crc kubenswrapper[4786]: W1002 06:58:18.865029 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56dd0e8b_9081_4dbc_8683_a0da0f1be122.slice/crio-d36cae8c7772af721491a1c4b661ecb85f2af9a22f10254207ecaaa235a0eca8 WatchSource:0}: Error finding container d36cae8c7772af721491a1c4b661ecb85f2af9a22f10254207ecaaa235a0eca8: Status 404 returned error can't find the container with id d36cae8c7772af721491a1c4b661ecb85f2af9a22f10254207ecaaa235a0eca8
Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.873098 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:fe3439557337d51c30cf1302608e5fc4623b3598ed5c49e9699f13e5abef61cc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:1c99923410d4cd0a721d2cc8a51d91d3ac800d5fda508c972ebe1e85ed2ca4d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:518df53777513ae6af94ff6a267d12981421525b27a5ab5042c2ac95967aa36d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:a5c05d9a996cfc9fdd05454149255f6d72b165369cda4f5cd8f56d4c60245f1a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:8f3b6c2ffde0d9678504683f2a788dafa0834c60f3d325b05bfdae855b0f5f98,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:a05f003cb0c8a5d3ae42d62cb8f06a494a819d801c5e77ebcfb684c9af1238f0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:4da48d8f16bfe4f35784b91b85bbd936c35ed26274a3991c54a19e6520c41c3c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:6362514c29a178610a6efabc57c7e05de596cd65279d4ce1faaf7114e3ca9926,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:510e66238e579d31e3fe9e05bf4ea31b1ddc0c1e4a28fe6355a69d0ab9ccac81,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:17a5a8c217d3344d7fbe8dd1b01a4a99194aad59c2683ccd39d077f528c0880e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:c5e9470dcf2202c26ba7112c5aa4803d550367dd8a5f86bbeca6ff8cb404076b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:645d4fca54f09ffba365383ac467cbc0fef04363dd22f9ab2f771721ef557705,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:eac564bfb8a13e2bcd4656ad8e4b45eb57536bd32ed8a33c0945b1fc60414c9d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:dec0378fb73f66dd39f6d7f7bdc6020d9bdf4c57c201eac3c63d7e97667627fe,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:64d895c5efce47a84c9d4a46a292441d28e5581599ef7cdf8a0a576d2a18288d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:ace365ee06fadfe22f47f3b243d23021ba034d8456c517872db7e4f2965756c5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:010a0936d4dee93552033421bcb26d7ae50bea00f2b59fd3cb2da05991cbb5d2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:229d12cd24aabc6b322c3b161c227c8ebc2692f4243b50a6adaa91bb230d30bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:709ea8ccc05ea65e20f9a8eb0ae1020d5524d9ae25123fe8b1f704ad98683bc9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:31f972c87db65413ce2dcf47f25d975f231feda6bedddf33bae9ae2d0eba173e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:b92f7aea6dde34128b0202bce479b3a2d92cf3cbf2a9055c99ece3f995158e03,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:80f9aa236ab23cc06dc2f42f8033b5851fa29bf1e3ee4961f4bf9aec78dc22fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:453f4e8e8319bbe843e6d78f2c18d98d9ea170ca6541931f5107981cd2a32996,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:5bc1dc22276aecd72a6fd5d7822ceaac25ffaffbbf16e931029fd0c791054dfa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:e992f55a9fbff89bf214ff0c97c73918800c64012443cc76516a361a9986e0d3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:ccd28e86a082f716c528dd19339b20b089babe706b9221d03a483bf5e38eaa85,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:b00e29030c9666484b2f5baac70e3de0a8cab4f58bd3e270f8fbf18953f97bd6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:e2165a561d61825f8c0fb2c0514e8e828f53fa25fe921e79bb0e4b440b10b136,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:b20fa3b0a3a01acd5d506742eaa6500a5e90bb92008ea7fad2e64580cfce9582,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c669f661c7b636aa2018e5e98d826141cf1626e32230a8985e4cfb6a55e6cb3e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:badaf9205a7dd15f7fa96302c4f1bfd7bbb4c99167ae9f7bf1424e8e7b05d77a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:9dc84b8f1c1db86d71c3d88ae091df6591b72307c2898cf42658b75db6a106a4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:e12f1673e4c4bbd77e1375f7cc9ee883c261542317d1d87f113f6091e90cea29,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:85c8d86100270d60e99d2a7ab5e875fa7634a6a8a6c351630fe3b964e1b11f0e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:3b539c6ed6c3a6b66a300e26e3d74432f2297279b11ec05fb31f717ebc5f304e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:a246e5f81db50cd10949057d047bcddd4add5de011c2b5553d82728e845e27d7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:f9f9d3cbaf14d1765dcf3af69c739125ba6b000c490df5f7d1648afe816e1faf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:cd4347fc6be340bb0a5d1fa5046f0372d5a2193d4913ed1e8038b4070fae5169,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:b47d4eae5e865b2f6e1baf1d2bf66aae8e3a91fb6576a1cf9fcb47e0aefd668a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:8e5f7a8e79285c4a6ef668b82b4809210b29a5ace62ee694cad41025c1307086,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:b3bd86ea948427d8b82fcdf709727863679a6152d0df70c9074d02886eb68dd6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:843f85a06093397d9e2dfe9fe89fee5403136ea7ee1ca2fc4f62bcc6424fb560,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:6373beb6be59c9792754c3c754bdbf3323b043f2802c82410fd082e6f9739004,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:05d1262fc9ae9424275f24aa9a18589fb676335ea27a1b438e42d5d18fc15179,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:83e473c5007e76c57b1afa32ad0f72173464e92eaf7a924b198af0c6422a7f28,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:e1961ac4c17954530517f025e7568ddb7b356d20d4298d826e164254da44b823,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:7f9881c69cc601e15662e64fccd927a9b06df2c11f8cafd8ee51d1c2c60446c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:2807588bf47b15de6aa29744ad0e7dd7ee3acc5dbaf9bd008b06546cb976805f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:186c1e750c5d17bff6c8cc78be273be77d846cb5132bdb3d13e4c70ce04ec41c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:1d821861636444ef495e6fbf0f4a767228eefedc4fe51c35ed782687d44fd8fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0d8e78a9369adac95777be7f2b7ed111f93bc5936b6f284fb1537dc7eae7192a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:87cb6a91c3a9dd4c487e99e9afd38abbca59dcae337f5fec5071c6f92007205d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:88f213718cf92b20884afce5905e9a604786648e459a80f0c95b2d2205f6e055,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:a67eff0f7040b234c4d856a076f1df98f3a3d0d2ef02fd499ec580c72c35a57b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:bffa691f6d884b143ff2e1ec18bf26fdfcea39492c68874b12a41aab94fdde38,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:f2074d21cd1928f96272477bf99dfb87ca34a700ddc06754597670b141cf4759,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:b8f132147073a1b2574908075c99ec6ad5cefe6243ae9fe64af2738bd1d01f4e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:c6a0cc0b86366e6436eaa3600e709d9bed8929f94884eeea19be87867ed4e027,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:52498394257731b2e2b61b813434a9cc9b39b2b6bcfaee3c4764ee716807761c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:eb839898ec6bba6e9d552ab64817f49485461b7ebceae1b121002a6e25836ccf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:3f39dceb7272639abe06fe1c218936d8068dfca721c6413bc91868fb77d88720,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:663ff44c4016c5a34460f30ed7f742b8a765ba5cc81897dcf5009d3062637626,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:0e366f865710291a234fb316973854b75946e8ed35dd47353b49de2107ead77c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:a336df8914f0bf065c3f2dd3641c5f96011b53cc504e1a5518979be25bbfc204,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:bf42dfd2e225818662aa28c4bb23204dc47b2b91127ca0e49b085baa1ea7609d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:e034577b9d0781f4548522a99ff2d13f205e67a30d0fc150c7cd0844e6752277,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:a2ac7724a8631b0e30a36b14bfd96711af16439a740cea5fde1405253e9798a0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:f02b83cbbef5604a7387575dad8a299ccb6afa13069dfe3eeec7acd0fda940b2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:9229a115d027f5a6576f50870d6fb80d428064662043b3fd7ee3972ee26614c4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:1c50a166bd99e649ad7da7fd5e997b4059e19a6c98a3a8fd4edf3595c15f18c7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:27296ea33566d09733c1122257994f7ab6caf3fd73758c710c81009c12430844,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:d88750a3464a2b6dd3bfcfc7222a9579b2aebdf23014ce835ad0b0d8492d0ad9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:aa5f2679b2241034703c548052aa73f1789a0dbc34a87c1cf8e63744f99fac1f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:6d735d97892
f6959ead0564ad0d3efc503fe6575f65aed42d7c074a8a99ef08c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:33712c48a2f32c3c7db4d1d77719599fb5619f97d21471ae4b7602be9f5841e2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:2bdfac03b30721b21964bce0b99b45d7f733a3b70104f4afdf4fdc7166aff109,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:4618ac84908528f1203c51515538e5f68bbff19fc3934543d7f89a358a618941,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:bd0ba26707f4b3cc5e98d7a6d6ae05617d47a87f41d89dcbb79a668f9063b963,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:ace1c9bb647fbbac08c2b1e674060962fcba9366b0c6b6bf9c5caad7c80844d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:f0f7b03511d721500f524c1c8a069e9948569b851b91f96c276cae593dd28425,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:37274b985899c529e98703b76a0b8ca6aac1de7a601788d54e90fdc12d1498f0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75txk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-787874f5b7tcwtm_openstack-operators(56dd0e8b-9081-4dbc-8683-a0da0f1be122): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 06:58:18 crc kubenswrapper[4786]: E1002 06:58:18.995819 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull 
QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" podUID="903d28be-eebf-4dd0-bd15-3f3e2a9416bf" Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.003922 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" podUID="fcbaee3e-a527-4eb0-9c2b-7ada804e7920" Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.019432 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" podUID="8e441bad-8657-4e25-a66b-caf4fc28ae8a" Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.030924 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" podUID="56dd0e8b-9081-4dbc-8683-a0da0f1be122" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.315762 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" event={"ID":"903d28be-eebf-4dd0-bd15-3f3e2a9416bf","Type":"ContainerStarted","Data":"a2a5df12cd373277a7206e367858fe65852f3e4a92695d2f1790870c61ae4918"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.315803 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" event={"ID":"903d28be-eebf-4dd0-bd15-3f3e2a9416bf","Type":"ContainerStarted","Data":"c9782706d2260fc8b29a1fce9f5eeb86e2ca33f77a0f065f229acc460a667d6a"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.321979 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" podUID="903d28be-eebf-4dd0-bd15-3f3e2a9416bf" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.323720 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" event={"ID":"ae767b5f-e700-4253-be42-48f90b9fe99e","Type":"ContainerStarted","Data":"52e83a0cf1d95536667026c03671e9ffefbeb373f5af6bcb02c14a737cc951a3"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.323761 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" event={"ID":"ae767b5f-e700-4253-be42-48f90b9fe99e","Type":"ContainerStarted","Data":"2fc7f4ae889ad781b8a9e382340107f6c6129625c66ecdb9933a71dec5c17300"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.326916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" event={"ID":"810706c5-80b3-47b8-8058-2b0aa1665942","Type":"ContainerStarted","Data":"bb17c19d2173a4c4e25461423f02c88e55e31363286799640ac2c525e1e9f5fa"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.334541 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" podUID="ae767b5f-e700-4253-be42-48f90b9fe99e" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.335903 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" event={"ID":"8e441bad-8657-4e25-a66b-caf4fc28ae8a","Type":"ContainerStarted","Data":"a9c95e59b1b6ed07af4777fb47094d7518f1d4e3e7fca7a666f16eca1d169c76"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.335933 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" event={"ID":"8e441bad-8657-4e25-a66b-caf4fc28ae8a","Type":"ContainerStarted","Data":"c14111b8442a402b402d072e8cf5330a3990bf2aebb2c1bbe722ba6851973ccf"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.336954 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" podUID="8e441bad-8657-4e25-a66b-caf4fc28ae8a" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.342842 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" event={"ID":"56dd0e8b-9081-4dbc-8683-a0da0f1be122","Type":"ContainerStarted","Data":"16fa054c7a4a5886de5af782ab30aecc33e9e6ce26b535649283c777354e1e2a"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.342884 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" event={"ID":"56dd0e8b-9081-4dbc-8683-a0da0f1be122","Type":"ContainerStarted","Data":"d36cae8c7772af721491a1c4b661ecb85f2af9a22f10254207ecaaa235a0eca8"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.346435 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" podUID="56dd0e8b-9081-4dbc-8683-a0da0f1be122" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.358517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" event={"ID":"458e49e3-37fa-4fde-b98e-35f6490ad3bc","Type":"ContainerStarted","Data":"2bf5c4175645ae64f44d995501551cad28cfe7df46155a504d7d82c9bd21e57d"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.360577 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" event={"ID":"8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9","Type":"ContainerStarted","Data":"ac3f9279457f8c6f17f5acb6c0ccf774abb9ad9313b1eb921596fcbf44c21368"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.364868 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" event={"ID":"e229888c-d55e-4b2f-a53a-e588868f98e2","Type":"ContainerStarted","Data":"fda944dda555662dfbdda38f66fab818d82f222469e4d9310d53554209585616"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.403490 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" event={"ID":"7aaa7a8a-2599-4f27-9bce-1fcee450fbfa","Type":"ContainerStarted","Data":"54f5eb3f781ea9af398b7f2905e731feafd64f91d330087541743bad5379e410"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.403546 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" 
event={"ID":"7aaa7a8a-2599-4f27-9bce-1fcee450fbfa","Type":"ContainerStarted","Data":"74d9f82fb59c3544bb393941e61847656fedf411f7902315bb4bc463b2bd062e"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.407768 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" podUID="7aaa7a8a-2599-4f27-9bce-1fcee450fbfa" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.407925 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" event={"ID":"3d6bf83e-81f7-4c56-994f-1058c8ddbe74","Type":"ContainerStarted","Data":"11ecfd6b65022c1f0df1903a1d94678f1d3a7d69a64f943e9fcc1b2030f57c63"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.419917 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" event={"ID":"4910bc83-927a-41bb-a4a6-834c09bcbc6e","Type":"ContainerStarted","Data":"95c0d4f7dd2c736341b4d768188f71964ee2c0308d833f58812def953adf3d99"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.422147 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" podUID="4910bc83-927a-41bb-a4a6-834c09bcbc6e" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.445350 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" 
event={"ID":"e49d1086-42df-449b-97b8-787edd49ba23","Type":"ContainerStarted","Data":"8abc41c9d7930a87d0cdea039d82cf030f7954ee7753b42ada08643ca0278ff8"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.451995 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" event={"ID":"12f9e929-b1eb-4dd9-a686-0154f89b5dfc","Type":"ContainerStarted","Data":"a9927c59c1405b2a528e20ae67e46eeb04728556882cc268dc1825b914f50d7b"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.452179 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" event={"ID":"12f9e929-b1eb-4dd9-a686-0154f89b5dfc","Type":"ContainerStarted","Data":"48a6c6703bda7da3ff2a6345fd471ad55f6614fb7b754b5c918ffca684616c04"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.456618 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" podUID="12f9e929-b1eb-4dd9-a686-0154f89b5dfc" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.461325 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" event={"ID":"fcbaee3e-a527-4eb0-9c2b-7ada804e7920","Type":"ContainerStarted","Data":"7dd0b546c4ea4b0a29e7029b0c42ed07a15d4e843026537011ae9dafde533fc4"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.461367 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" 
event={"ID":"fcbaee3e-a527-4eb0-9c2b-7ada804e7920","Type":"ContainerStarted","Data":"4e5271f085e36e78e7185012c37ad473128f9dd33d5f7514ae6ff297263ab93d"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.467525 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" podUID="fcbaee3e-a527-4eb0-9c2b-7ada804e7920" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.493232 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" event={"ID":"ed52f44e-1a43-4d07-a613-0d5d7c367a51","Type":"ContainerStarted","Data":"bd860897cc800a8ec04dc57e9c6660fc5521981e01b68a573973dafda00d1dc4"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.493279 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" event={"ID":"ed52f44e-1a43-4d07-a613-0d5d7c367a51","Type":"ContainerStarted","Data":"ff6cdd0a59ec62dbc465f36e39e4222769ac5d2215cf0a3cf4827c83a410e8e7"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.495999 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" event={"ID":"40d01a1d-0613-43eb-824b-24b22a879822","Type":"ContainerStarted","Data":"a46869e9a552b430dc7d9a05f9ee89468d34332b4849d0384eaffde05ead53f7"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.497097 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" podUID="ed52f44e-1a43-4d07-a613-0d5d7c367a51" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.499157 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" event={"ID":"9c27556b-9c6e-4a2d-9c2d-78f471392a85","Type":"ContainerStarted","Data":"acf20403b3a5e92260d088dd546cb4a70897b7f12bde177f3195211b7ffb65ba"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.500590 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" event={"ID":"f496c1c2-326e-4429-b192-07a5ca33b28d","Type":"ContainerStarted","Data":"78f034108dfa0f5e0faf07147b7f8e8e294df1d74abdffbf0458f06cb398abc2"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.500636 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" event={"ID":"f496c1c2-326e-4429-b192-07a5ca33b28d","Type":"ContainerStarted","Data":"2f92544c44a8f023538ff36b69f3ba925308d496c153b3d156a5b3cf87e5033e"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.506753 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" event={"ID":"71792270-25e9-4028-b306-c235e6378802","Type":"ContainerStarted","Data":"aae9031fc39179131d5ae53e7cb31ef711c292a47443ed3afa8eefcc20aaafdf"} Oct 02 06:58:19 crc kubenswrapper[4786]: E1002 06:58:19.506754 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" podUID="f496c1c2-326e-4429-b192-07a5ca33b28d" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.517026 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" event={"ID":"bafce6df-ad32-4af9-9e38-d6da16305ee9","Type":"ContainerStarted","Data":"0ce81dee551ffaef2832c1a01c13ccccab63a20d5f71c425bac2999bdbe78a01"} Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.691900 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert\") pod \"openstack-operator-controller-manager-67698bcd47-cjhhk\" (UID: \"8830c499-339f-4252-8f35-13671a0b6687\") " pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.719807 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8830c499-339f-4252-8f35-13671a0b6687-cert\") pod \"openstack-operator-controller-manager-67698bcd47-cjhhk\" (UID: \"8830c499-339f-4252-8f35-13671a0b6687\") " pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:19 crc kubenswrapper[4786]: I1002 06:58:19.870524 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:20 crc kubenswrapper[4786]: I1002 06:58:20.390701 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk"] Oct 02 06:58:20 crc kubenswrapper[4786]: W1002 06:58:20.426917 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8830c499_339f_4252_8f35_13671a0b6687.slice/crio-e4731abd7a31ff1085ce829ac0b9d1fd9f3603181be6112af4f600923f353c1d WatchSource:0}: Error finding container e4731abd7a31ff1085ce829ac0b9d1fd9f3603181be6112af4f600923f353c1d: Status 404 returned error can't find the container with id e4731abd7a31ff1085ce829ac0b9d1fd9f3603181be6112af4f600923f353c1d Oct 02 06:58:20 crc kubenswrapper[4786]: I1002 06:58:20.535411 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" event={"ID":"8830c499-339f-4252-8f35-13671a0b6687","Type":"ContainerStarted","Data":"e4731abd7a31ff1085ce829ac0b9d1fd9f3603181be6112af4f600923f353c1d"} Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.537102 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" podUID="12f9e929-b1eb-4dd9-a686-0154f89b5dfc" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.537587 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" podUID="f496c1c2-326e-4429-b192-07a5ca33b28d" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.537654 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" podUID="ed52f44e-1a43-4d07-a613-0d5d7c367a51" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.537675 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" podUID="7aaa7a8a-2599-4f27-9bce-1fcee450fbfa" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.537728 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" podUID="fcbaee3e-a527-4eb0-9c2b-7ada804e7920" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.538343 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" podUID="56dd0e8b-9081-4dbc-8683-a0da0f1be122" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.538594 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" podUID="903d28be-eebf-4dd0-bd15-3f3e2a9416bf" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.538672 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" podUID="ae767b5f-e700-4253-be42-48f90b9fe99e" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.538891 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:bca053da8adc37a9a246b478949960ac7abef8fcc0c58a2a45045c59a62b5fe4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" podUID="8e441bad-8657-4e25-a66b-caf4fc28ae8a" Oct 02 06:58:20 crc kubenswrapper[4786]: E1002 06:58:20.539802 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" podUID="4910bc83-927a-41bb-a4a6-834c09bcbc6e" Oct 02 06:58:21 crc kubenswrapper[4786]: I1002 06:58:21.542777 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" event={"ID":"8830c499-339f-4252-8f35-13671a0b6687","Type":"ContainerStarted","Data":"7e7ea241863a6990c5d7b55ed9801a8422317c28af95258f887f5e5e2fdaafd2"} Oct 02 06:58:21 crc kubenswrapper[4786]: I1002 06:58:21.543227 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" event={"ID":"8830c499-339f-4252-8f35-13671a0b6687","Type":"ContainerStarted","Data":"3d2c53eef0e34fef03a7ad9912a403633067a88ad7621633b7c6c0e2cd2789c0"} Oct 02 06:58:21 crc kubenswrapper[4786]: I1002 06:58:21.543245 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:21 crc kubenswrapper[4786]: I1002 06:58:21.568882 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" podStartSLOduration=4.568865568 podStartE2EDuration="4.568865568s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:58:21.565636846 +0000 UTC m=+711.686819997" watchObservedRunningTime="2025-10-02 06:58:21.568865568 +0000 UTC m=+711.690048699" Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.577500 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" 
event={"ID":"8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9","Type":"ContainerStarted","Data":"50927e4bfc9c8f843474a8a08c5701c32cde9bd8436eca6bb5fd4623f9f423d4"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.581148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" event={"ID":"810706c5-80b3-47b8-8058-2b0aa1665942","Type":"ContainerStarted","Data":"6cf6cbc1c6b68c9933ad70478d73acef0e30f6c65b08018c6dca0bd4141b75b8"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.589167 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" event={"ID":"0872cd5d-1fde-4b27-bdb7-6eade27cee9d","Type":"ContainerStarted","Data":"65c5fe0ee3b818dfc9e623f2063872405779204fdd81ec0fc27f46d0eee211b9"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.592906 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" event={"ID":"40d01a1d-0613-43eb-824b-24b22a879822","Type":"ContainerStarted","Data":"e54c7b87eff53cff33b2a16c1a96f66de7cb9c07a20f26b2a2074c61d0735946"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.595822 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" event={"ID":"71792270-25e9-4028-b306-c235e6378802","Type":"ContainerStarted","Data":"9e378e1fe83ff8bb29f05bda6e9e779eaa593c6bba7358cf3f148348969020af"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.600081 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" event={"ID":"458e49e3-37fa-4fde-b98e-35f6490ad3bc","Type":"ContainerStarted","Data":"886119fd0cf776879562a1c166e3c0d6fcbae08d4d03e8cacbe9286ea52b8177"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.647521 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" event={"ID":"e229888c-d55e-4b2f-a53a-e588868f98e2","Type":"ContainerStarted","Data":"2b038e06e64cd57d52a09dd6b9e941361180dd298be8888169c3b6065bb3e382"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.661033 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" event={"ID":"bafce6df-ad32-4af9-9e38-d6da16305ee9","Type":"ContainerStarted","Data":"f28ec786ee5b73567d7665e2d920f5ad2c385a2f6ced2003efbc632628a90cd0"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.675360 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" event={"ID":"90cbbf62-df45-490f-a76d-6b24fdfe6aa7","Type":"ContainerStarted","Data":"a5270cf225e2e6ed273f40e3b269d558b302cf819b806bd8f61a93e53613017f"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.687645 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" event={"ID":"e49d1086-42df-449b-97b8-787edd49ba23","Type":"ContainerStarted","Data":"482c1545b2f25ca5ab55a52014e5e720e4b98387aa0c42dd0efc11d9a5f5c215"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.712385 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" event={"ID":"9c27556b-9c6e-4a2d-9c2d-78f471392a85","Type":"ContainerStarted","Data":"b8859c2d6dbf2c3036472fb7372c756f30483c401815a54dc3fc148797cfed21"} Oct 02 06:58:26 crc kubenswrapper[4786]: I1002 06:58:26.724887 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" event={"ID":"3d6bf83e-81f7-4c56-994f-1058c8ddbe74","Type":"ContainerStarted","Data":"179dae4b53ef2ea4e52337e0802720d62942f60d3c4e64887be935917bb4233d"} Oct 02 06:58:27 
crc kubenswrapper[4786]: I1002 06:58:27.497555 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.497609 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.731889 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" event={"ID":"71792270-25e9-4028-b306-c235e6378802","Type":"ContainerStarted","Data":"81377fa560c490a479d10db70f50400ebce3f962ca4a5f086b3b047e71c1dca8"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.732033 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.734367 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" event={"ID":"8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9","Type":"ContainerStarted","Data":"22f85b6de51e0e78c19fede299998ed17da637e8dc64c2383233de84f16db8e9"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.734491 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.737244 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" event={"ID":"e229888c-d55e-4b2f-a53a-e588868f98e2","Type":"ContainerStarted","Data":"a1e74e0c2a64111aeb6a971dc18f06af0be433976b6d7006dd4a5cc13d72c976"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.737322 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.739722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" event={"ID":"bafce6df-ad32-4af9-9e38-d6da16305ee9","Type":"ContainerStarted","Data":"a0dd7524e7c4edd47493caa5b03039b9bb202a8257463c1e17ee851599aa0880"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.739852 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.741967 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" event={"ID":"90cbbf62-df45-490f-a76d-6b24fdfe6aa7","Type":"ContainerStarted","Data":"a09955a170a1779a6cf83237be34ac2efa8c44a72615d93756cbd5e20123c864"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.742092 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.744080 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" event={"ID":"810706c5-80b3-47b8-8058-2b0aa1665942","Type":"ContainerStarted","Data":"9d0cfdbfde82cbef50cee2af739aea2eedefd5f93eaf681bb432db36f8db2adf"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.744198 4786 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.747402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" event={"ID":"e49d1086-42df-449b-97b8-787edd49ba23","Type":"ContainerStarted","Data":"af70ecaddcc3ed5614e18a42b2d46a010cf98d5c019cd555227ef73a28fbde91"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.747521 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.747996 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" podStartSLOduration=3.126210869 podStartE2EDuration="10.747987048s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.506378583 +0000 UTC m=+708.627561714" lastFinishedPulling="2025-10-02 06:58:26.128154763 +0000 UTC m=+716.249337893" observedRunningTime="2025-10-02 06:58:27.745848374 +0000 UTC m=+717.867031505" watchObservedRunningTime="2025-10-02 06:58:27.747987048 +0000 UTC m=+717.869170179" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.749713 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" event={"ID":"0872cd5d-1fde-4b27-bdb7-6eade27cee9d","Type":"ContainerStarted","Data":"4922de93ed37618d55070cfd84530102c1fc39d5763be538afa0ccfd0cbcc838"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.749829 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.751500 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" event={"ID":"40d01a1d-0613-43eb-824b-24b22a879822","Type":"ContainerStarted","Data":"ba9b7406dd60610f99df39a8d13d266b695feffda4a3574e83a96de014549fb8"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.751558 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.753351 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" event={"ID":"9c27556b-9c6e-4a2d-9c2d-78f471392a85","Type":"ContainerStarted","Data":"db2ea5974d3c6acca8f8e51628880702ecd24b6f3dcbd1860a547eb3af01016e"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.753523 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.755031 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" event={"ID":"3d6bf83e-81f7-4c56-994f-1058c8ddbe74","Type":"ContainerStarted","Data":"0c4a0bb19b75a52d677cc34082e188ae24e711e28fa54376505043b9766d1fbb"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.755086 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.756418 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" event={"ID":"458e49e3-37fa-4fde-b98e-35f6490ad3bc","Type":"ContainerStarted","Data":"6b8d645216fbb78ca395152018d6ed0c1bd24054523d0b6c4d6d1508012dfe45"} Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.756543 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.761238 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" podStartSLOduration=3.135436927 podStartE2EDuration="10.761216178s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.492132406 +0000 UTC m=+708.613315537" lastFinishedPulling="2025-10-02 06:58:26.117911657 +0000 UTC m=+716.239094788" observedRunningTime="2025-10-02 06:58:27.759847727 +0000 UTC m=+717.881030868" watchObservedRunningTime="2025-10-02 06:58:27.761216178 +0000 UTC m=+717.882399309" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.773811 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" podStartSLOduration=3.136124716 podStartE2EDuration="10.773800321s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.489563719 +0000 UTC m=+708.610746851" lastFinishedPulling="2025-10-02 06:58:26.127239325 +0000 UTC m=+716.248422456" observedRunningTime="2025-10-02 06:58:27.769953746 +0000 UTC m=+717.891136887" watchObservedRunningTime="2025-10-02 06:58:27.773800321 +0000 UTC m=+717.894983452" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.795941 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c" podStartSLOduration=3.472348312 podStartE2EDuration="10.795929466s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.837922885 +0000 UTC m=+708.959106016" lastFinishedPulling="2025-10-02 06:58:26.161504039 +0000 UTC m=+716.282687170" observedRunningTime="2025-10-02 06:58:27.795913306 
+0000 UTC m=+717.917096447" watchObservedRunningTime="2025-10-02 06:58:27.795929466 +0000 UTC m=+717.917112597" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.796257 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" podStartSLOduration=3.153104104 podStartE2EDuration="10.796252945s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.486887842 +0000 UTC m=+708.608070973" lastFinishedPulling="2025-10-02 06:58:26.130036682 +0000 UTC m=+716.251219814" observedRunningTime="2025-10-02 06:58:27.785463958 +0000 UTC m=+717.906647099" watchObservedRunningTime="2025-10-02 06:58:27.796252945 +0000 UTC m=+717.917436077" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.812965 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" podStartSLOduration=2.9665461029999998 podStartE2EDuration="10.812944596s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.281876145 +0000 UTC m=+708.403059277" lastFinishedPulling="2025-10-02 06:58:26.128274638 +0000 UTC m=+716.249457770" observedRunningTime="2025-10-02 06:58:27.808087776 +0000 UTC m=+717.929270917" watchObservedRunningTime="2025-10-02 06:58:27.812944596 +0000 UTC m=+717.934127727" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.821242 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" podStartSLOduration=3.1364722130000002 podStartE2EDuration="10.821222288s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.433093633 +0000 UTC m=+708.554276765" lastFinishedPulling="2025-10-02 06:58:26.11784371 +0000 UTC m=+716.239026840" observedRunningTime="2025-10-02 06:58:27.82015754 +0000 UTC 
m=+717.941340671" watchObservedRunningTime="2025-10-02 06:58:27.821222288 +0000 UTC m=+717.942405418" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.835108 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" podStartSLOduration=2.993427434 podStartE2EDuration="10.835097866s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.281901685 +0000 UTC m=+708.403084815" lastFinishedPulling="2025-10-02 06:58:26.123572117 +0000 UTC m=+716.244755247" observedRunningTime="2025-10-02 06:58:27.832425306 +0000 UTC m=+717.953608437" watchObservedRunningTime="2025-10-02 06:58:27.835097866 +0000 UTC m=+717.956280997" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.847767 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" podStartSLOduration=3.186084744 podStartE2EDuration="10.847747002s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.470959822 +0000 UTC m=+708.592142953" lastFinishedPulling="2025-10-02 06:58:26.13262208 +0000 UTC m=+716.253805211" observedRunningTime="2025-10-02 06:58:27.843592497 +0000 UTC m=+717.964775628" watchObservedRunningTime="2025-10-02 06:58:27.847747002 +0000 UTC m=+717.968930134" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.854124 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" podStartSLOduration=3.171244682 podStartE2EDuration="10.854115202s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.482186872 +0000 UTC m=+708.603370004" lastFinishedPulling="2025-10-02 06:58:26.165057393 +0000 UTC m=+716.286240524" observedRunningTime="2025-10-02 06:58:27.853055153 +0000 UTC m=+717.974238284" 
watchObservedRunningTime="2025-10-02 06:58:27.854115202 +0000 UTC m=+717.975298333" Oct 02 06:58:27 crc kubenswrapper[4786]: I1002 06:58:27.873141 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" podStartSLOduration=3.255640442 podStartE2EDuration="10.873119654s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.510321021 +0000 UTC m=+708.631504152" lastFinishedPulling="2025-10-02 06:58:26.127800233 +0000 UTC m=+716.248983364" observedRunningTime="2025-10-02 06:58:27.867162559 +0000 UTC m=+717.988345690" watchObservedRunningTime="2025-10-02 06:58:27.873119654 +0000 UTC m=+717.994302785" Oct 02 06:58:29 crc kubenswrapper[4786]: I1002 06:58:29.876346 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-67698bcd47-cjhhk" Oct 02 06:58:29 crc kubenswrapper[4786]: I1002 06:58:29.896492 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" podStartSLOduration=5.406117306 podStartE2EDuration="12.896474601s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.643090694 +0000 UTC m=+708.764273825" lastFinishedPulling="2025-10-02 06:58:26.133447989 +0000 UTC m=+716.254631120" observedRunningTime="2025-10-02 06:58:27.881918359 +0000 UTC m=+718.003101500" watchObservedRunningTime="2025-10-02 06:58:29.896474601 +0000 UTC m=+720.017657733" Oct 02 06:58:32 crc kubenswrapper[4786]: I1002 06:58:32.787719 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" event={"ID":"12f9e929-b1eb-4dd9-a686-0154f89b5dfc","Type":"ContainerStarted","Data":"3ed08fe987d16883f1038112001ebc1721157599bb60348f5f9198a4aedbc874"} Oct 02 06:58:32 crc 
kubenswrapper[4786]: I1002 06:58:32.788937 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" Oct 02 06:58:32 crc kubenswrapper[4786]: I1002 06:58:32.804347 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" podStartSLOduration=2.141481287 podStartE2EDuration="15.804328686s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.680206236 +0000 UTC m=+708.801389368" lastFinishedPulling="2025-10-02 06:58:32.343053636 +0000 UTC m=+722.464236767" observedRunningTime="2025-10-02 06:58:32.799427313 +0000 UTC m=+722.920610454" watchObservedRunningTime="2025-10-02 06:58:32.804328686 +0000 UTC m=+722.925511818" Oct 02 06:58:33 crc kubenswrapper[4786]: I1002 06:58:33.795329 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" event={"ID":"7aaa7a8a-2599-4f27-9bce-1fcee450fbfa","Type":"ContainerStarted","Data":"854ca918ebc475cea8354163fdd5857ea61085a0a279999718d49a21178e6637"} Oct 02 06:58:33 crc kubenswrapper[4786]: I1002 06:58:33.795829 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" Oct 02 06:58:33 crc kubenswrapper[4786]: I1002 06:58:33.810881 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq" podStartSLOduration=1.873894903 podStartE2EDuration="16.810863801s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.658703549 +0000 UTC m=+708.779886680" lastFinishedPulling="2025-10-02 06:58:33.595672448 +0000 UTC m=+723.716855578" observedRunningTime="2025-10-02 06:58:33.808220696 +0000 UTC m=+723.929403848" 
watchObservedRunningTime="2025-10-02 06:58:33.810863801 +0000 UTC m=+723.932046933" Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.810534 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" event={"ID":"fcbaee3e-a527-4eb0-9c2b-7ada804e7920","Type":"ContainerStarted","Data":"a61ed1da1d591c16b20e7ca3d93643b5e9d849f5b7da5f3e4d5f1dd090c7cb86"} Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.811198 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.812855 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" event={"ID":"903d28be-eebf-4dd0-bd15-3f3e2a9416bf","Type":"ContainerStarted","Data":"f15a629b903f1cc67ae2aa5cc6933d31cd1d6a108c4cb1a718cd4d1429f9a15c"} Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.813063 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.816593 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" event={"ID":"8e441bad-8657-4e25-a66b-caf4fc28ae8a","Type":"ContainerStarted","Data":"64da0d8f7a74b0b8345f56cd1b80b2e3a2aa50ca4a75aa6ffc6450c7d3180a03"} Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.816832 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.818202 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" 
event={"ID":"ed52f44e-1a43-4d07-a613-0d5d7c367a51","Type":"ContainerStarted","Data":"c932c7d4de1445eebaf3e307159ef3abe9821c76bbde8a3bf99fdbff8d1fbc6e"} Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.818961 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.825102 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn" podStartSLOduration=2.08573103 podStartE2EDuration="18.825084011s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.862639626 +0000 UTC m=+708.983822757" lastFinishedPulling="2025-10-02 06:58:35.601992607 +0000 UTC m=+725.723175738" observedRunningTime="2025-10-02 06:58:35.823496015 +0000 UTC m=+725.944679156" watchObservedRunningTime="2025-10-02 06:58:35.825084011 +0000 UTC m=+725.946267141" Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.838106 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp" podStartSLOduration=2.077142176 podStartE2EDuration="18.838092654s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.853140846 +0000 UTC m=+708.974323977" lastFinishedPulling="2025-10-02 06:58:35.614091324 +0000 UTC m=+725.735274455" observedRunningTime="2025-10-02 06:58:35.834534414 +0000 UTC m=+725.955717555" watchObservedRunningTime="2025-10-02 06:58:35.838092654 +0000 UTC m=+725.959275785" Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.848710 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx" podStartSLOduration=2.094929816 podStartE2EDuration="18.848683649s" podCreationTimestamp="2025-10-02 06:58:17 
+0000 UTC" firstStartedPulling="2025-10-02 06:58:18.844037561 +0000 UTC m=+708.965220692" lastFinishedPulling="2025-10-02 06:58:35.597791393 +0000 UTC m=+725.718974525" observedRunningTime="2025-10-02 06:58:35.845924084 +0000 UTC m=+725.967107226" watchObservedRunningTime="2025-10-02 06:58:35.848683649 +0000 UTC m=+725.969866780" Oct 02 06:58:35 crc kubenswrapper[4786]: I1002 06:58:35.859561 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj" podStartSLOduration=1.907982485 podStartE2EDuration="18.859540023s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.648893071 +0000 UTC m=+708.770076202" lastFinishedPulling="2025-10-02 06:58:35.600450609 +0000 UTC m=+725.721633740" observedRunningTime="2025-10-02 06:58:35.856082692 +0000 UTC m=+725.977265834" watchObservedRunningTime="2025-10-02 06:58:35.859540023 +0000 UTC m=+725.980723154" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.569854 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859cd486d-5b4mb" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.575601 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-f7f98cb69-dkqvh" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.598575 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-77fb7bcf5b-4rqms" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.608778 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8bc4775b5-w6s6f" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.629179 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-5b4fc86755-xqp2m" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.654217 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-679b4759bb-xcd8d" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.660328 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f45cd594f-m99xq" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.700944 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-59d7dc95cf-jf2qh" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.718029 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-b7cf8cb5f-x6hzs" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.722367 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf5bb885-7xngz" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.749228 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54fbbfcd44-jpgjv" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.755092 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7fd5b6bbc6-8cfbh" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.831947 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" event={"ID":"f496c1c2-326e-4429-b192-07a5ca33b28d","Type":"ContainerStarted","Data":"f45efba0052780c454775203d4e40820663697e9d3c46da65a770c8eb744d087"} Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.832190 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.835744 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" event={"ID":"ae767b5f-e700-4253-be42-48f90b9fe99e","Type":"ContainerStarted","Data":"a4c58ce240a3c53c540e958034931c0bde33b2ba011bd0a56b38a1307deeff85"} Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.836001 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.864844 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n" podStartSLOduration=2.201956124 podStartE2EDuration="20.864818899s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.653939652 +0000 UTC m=+708.775122782" lastFinishedPulling="2025-10-02 06:58:37.316802426 +0000 UTC m=+727.437985557" observedRunningTime="2025-10-02 06:58:37.862109419 +0000 UTC m=+727.983292550" watchObservedRunningTime="2025-10-02 06:58:37.864818899 +0000 UTC m=+727.986002030" Oct 02 06:58:37 crc kubenswrapper[4786]: I1002 06:58:37.866030 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465" podStartSLOduration=2.228603645 podStartE2EDuration="20.866020675s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.677803794 +0000 UTC m=+708.798986925" lastFinishedPulling="2025-10-02 06:58:37.315220823 +0000 UTC m=+727.436403955" observedRunningTime="2025-10-02 06:58:37.848633644 +0000 UTC m=+727.969816795" watchObservedRunningTime="2025-10-02 06:58:37.866020675 +0000 
UTC m=+727.987203806"
Oct 02 06:58:38 crc kubenswrapper[4786]: I1002 06:58:38.188056 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-68d7bc5569-6n59c"
Oct 02 06:58:38 crc kubenswrapper[4786]: I1002 06:58:38.842123 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" event={"ID":"4910bc83-927a-41bb-a4a6-834c09bcbc6e","Type":"ContainerStarted","Data":"c93c63b28525c042e9c4239ef8c4b794c6be2d9cde01902656e9f6ce53d4e1e8"}
Oct 02 06:58:38 crc kubenswrapper[4786]: I1002 06:58:38.843983 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" event={"ID":"56dd0e8b-9081-4dbc-8683-a0da0f1be122","Type":"ContainerStarted","Data":"9b35c632e716bd0bbf5d0bf277b17dc7bb32344a28de19d365e744be0f2e0f0a"}
Oct 02 06:58:38 crc kubenswrapper[4786]: I1002 06:58:38.844210 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm"
Oct 02 06:58:38 crc kubenswrapper[4786]: I1002 06:58:38.854407 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5" podStartSLOduration=1.077450051 podStartE2EDuration="20.854398887s" podCreationTimestamp="2025-10-02 06:58:18 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.861651452 +0000 UTC m=+708.982834583" lastFinishedPulling="2025-10-02 06:58:38.638600287 +0000 UTC m=+728.759783419" observedRunningTime="2025-10-02 06:58:38.853420432 +0000 UTC m=+728.974603563" watchObservedRunningTime="2025-10-02 06:58:38.854398887 +0000 UTC m=+728.975582017"
Oct 02 06:58:38 crc kubenswrapper[4786]: I1002 06:58:38.874910 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm" podStartSLOduration=2.104532003 podStartE2EDuration="21.874894932s" podCreationTimestamp="2025-10-02 06:58:17 +0000 UTC" firstStartedPulling="2025-10-02 06:58:18.872745 +0000 UTC m=+708.993928131" lastFinishedPulling="2025-10-02 06:58:38.643107929 +0000 UTC m=+728.764291060" observedRunningTime="2025-10-02 06:58:38.871484419 +0000 UTC m=+728.992667550" watchObservedRunningTime="2025-10-02 06:58:38.874894932 +0000 UTC m=+728.996078063"
Oct 02 06:58:47 crc kubenswrapper[4786]: I1002 06:58:47.758574 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-75f8d67d86-xv465"
Oct 02 06:58:47 crc kubenswrapper[4786]: I1002 06:58:47.783356 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-84c745747f-p85rq"
Oct 02 06:58:47 crc kubenswrapper[4786]: I1002 06:58:47.804884 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-598c4c8547-r2qcj"
Oct 02 06:58:47 crc kubenswrapper[4786]: I1002 06:58:47.952623 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-689b4f76c9-gslcx"
Oct 02 06:58:47 crc kubenswrapper[4786]: I1002 06:58:47.990326 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-cb66d6b59-89t2n"
Oct 02 06:58:48 crc kubenswrapper[4786]: I1002 06:58:48.143346 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-cbdf6dc66-phjsn"
Oct 02 06:58:48 crc kubenswrapper[4786]: I1002 06:58:48.240976 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5c8fdc4d5c-5vzvp"
Oct 02 06:58:48 crc kubenswrapper[4786]: I1002 06:58:48.385431 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-787874f5b7tcwtm"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.130896 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nxvrf"]
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.131541 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" podUID="4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" containerName="controller-manager" containerID="cri-o://4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2" gracePeriod=30
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.241148 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"]
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.241328 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" podUID="6647561d-fd2f-4b60-b155-ffbc54b0da4f" containerName="route-controller-manager" containerID="cri-o://9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94" gracePeriod=30
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.474753 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.530593 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2sdk\" (UniqueName: \"kubernetes.io/projected/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-kube-api-access-l2sdk\") pod \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.530670 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-config\") pod \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.530728 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-proxy-ca-bundles\") pod \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.530763 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-client-ca\") pod \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.530783 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-serving-cert\") pod \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\" (UID: \"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.531544 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" (UID: "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.531559 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-client-ca" (OuterVolumeSpecName: "client-ca") pod "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" (UID: "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.531624 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-config" (OuterVolumeSpecName: "config") pod "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" (UID: "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.536554 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-kube-api-access-l2sdk" (OuterVolumeSpecName: "kube-api-access-l2sdk") pod "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" (UID: "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9"). InnerVolumeSpecName "kube-api-access-l2sdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.537945 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" (UID: "4d8c9c2b-6b0f-4173-91c9-1dbed356baa9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.564023 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.631631 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-client-ca\") pod \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.631701 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-config\") pod \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.631748 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6647561d-fd2f-4b60-b155-ffbc54b0da4f-serving-cert\") pod \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.631806 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9j8t\" (UniqueName: \"kubernetes.io/projected/6647561d-fd2f-4b60-b155-ffbc54b0da4f-kube-api-access-h9j8t\") pod \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\" (UID: \"6647561d-fd2f-4b60-b155-ffbc54b0da4f\") "
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.632012 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-client-ca\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.632034 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.632043 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2sdk\" (UniqueName: \"kubernetes.io/projected/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-kube-api-access-l2sdk\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.632052 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-config\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.632063 4786 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.632456 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-client-ca" (OuterVolumeSpecName: "client-ca") pod "6647561d-fd2f-4b60-b155-ffbc54b0da4f" (UID: "6647561d-fd2f-4b60-b155-ffbc54b0da4f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.632497 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-config" (OuterVolumeSpecName: "config") pod "6647561d-fd2f-4b60-b155-ffbc54b0da4f" (UID: "6647561d-fd2f-4b60-b155-ffbc54b0da4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.634950 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6647561d-fd2f-4b60-b155-ffbc54b0da4f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6647561d-fd2f-4b60-b155-ffbc54b0da4f" (UID: "6647561d-fd2f-4b60-b155-ffbc54b0da4f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.635724 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6647561d-fd2f-4b60-b155-ffbc54b0da4f-kube-api-access-h9j8t" (OuterVolumeSpecName: "kube-api-access-h9j8t") pod "6647561d-fd2f-4b60-b155-ffbc54b0da4f" (UID: "6647561d-fd2f-4b60-b155-ffbc54b0da4f"). InnerVolumeSpecName "kube-api-access-h9j8t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.733396 4786 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-client-ca\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.733425 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6647561d-fd2f-4b60-b155-ffbc54b0da4f-config\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.733435 4786 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6647561d-fd2f-4b60-b155-ffbc54b0da4f-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.733445 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9j8t\" (UniqueName: \"kubernetes.io/projected/6647561d-fd2f-4b60-b155-ffbc54b0da4f-kube-api-access-h9j8t\") on node \"crc\" DevicePath \"\""
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.909434 4786 generic.go:334] "Generic (PLEG): container finished" podID="6647561d-fd2f-4b60-b155-ffbc54b0da4f" containerID="9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94" exitCode=0
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.909474 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.909517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" event={"ID":"6647561d-fd2f-4b60-b155-ffbc54b0da4f","Type":"ContainerDied","Data":"9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94"}
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.909545 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q" event={"ID":"6647561d-fd2f-4b60-b155-ffbc54b0da4f","Type":"ContainerDied","Data":"bfdf0e7d3c15d0d021fd2be732ce342ccf03a7606372e3f925baac13f1ed2fe9"}
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.909562 4786 scope.go:117] "RemoveContainer" containerID="9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.910861 4786 generic.go:334] "Generic (PLEG): container finished" podID="4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" containerID="4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2" exitCode=0
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.910902 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" event={"ID":"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9","Type":"ContainerDied","Data":"4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2"}
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.910925 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf" event={"ID":"4d8c9c2b-6b0f-4173-91c9-1dbed356baa9","Type":"ContainerDied","Data":"a7d64391a90a24288a8ae42cffb251ccb753d39c045c9738e6a66a0ef604305a"}
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.910974 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nxvrf"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.924147 4786 scope.go:117] "RemoveContainer" containerID="9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94"
Oct 02 06:58:50 crc kubenswrapper[4786]: E1002 06:58:50.924872 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94\": container with ID starting with 9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94 not found: ID does not exist" containerID="9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.924908 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94"} err="failed to get container status \"9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94\": rpc error: code = NotFound desc = could not find container \"9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94\": container with ID starting with 9298bce7c639cf9369af7deb476a478e6c6ad1ee02bd990a3254abefd0674d94 not found: ID does not exist"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.924932 4786 scope.go:117] "RemoveContainer" containerID="4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.938587 4786 scope.go:117] "RemoveContainer" containerID="4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.938987 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"]
Oct 02 06:58:50 crc kubenswrapper[4786]: E1002 06:58:50.939015 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2\": container with ID starting with 4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2 not found: ID does not exist" containerID="4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.939049 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2"} err="failed to get container status \"4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2\": rpc error: code = NotFound desc = could not find container \"4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2\": container with ID starting with 4ded5e33257b00dd32d3530875545645904eed75a5999eefeadc5c6d210368b2 not found: ID does not exist"
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.945296 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6z59q"]
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.949230 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nxvrf"]
Oct 02 06:58:50 crc kubenswrapper[4786]: I1002 06:58:50.953632 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nxvrf"]
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.765633 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"]
Oct 02 06:58:51 crc kubenswrapper[4786]: E1002 06:58:51.766248 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" containerName="controller-manager"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.766264 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" containerName="controller-manager"
Oct 02 06:58:51 crc kubenswrapper[4786]: E1002 06:58:51.766295 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6647561d-fd2f-4b60-b155-ffbc54b0da4f" containerName="route-controller-manager"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.766301 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6647561d-fd2f-4b60-b155-ffbc54b0da4f" containerName="route-controller-manager"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.766627 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" containerName="controller-manager"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.766654 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6647561d-fd2f-4b60-b155-ffbc54b0da4f" containerName="route-controller-manager"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.776025 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.780111 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.780254 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.780467 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.780552 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.780619 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.781007 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.785176 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.786160 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"]
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.846488 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a71fdcab-cb7c-4489-82d9-69c310a6d352-serving-cert\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.846536 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfr4b\" (UniqueName: \"kubernetes.io/projected/a71fdcab-cb7c-4489-82d9-69c310a6d352-kube-api-access-wfr4b\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.846571 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-config\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.846636 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-proxy-ca-bundles\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.846683 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-client-ca\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.887536 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"]
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.888808 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.890908 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.892548 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.892844 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.893018 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.893549 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.895750 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.900008 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"]
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948295 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a71fdcab-cb7c-4489-82d9-69c310a6d352-serving-cert\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948340 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfr4b\" (UniqueName: \"kubernetes.io/projected/a71fdcab-cb7c-4489-82d9-69c310a6d352-kube-api-access-wfr4b\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948367 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e0dea13-c646-44d1-997c-d27c83bb9427-serving-cert\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948407 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-config\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948431 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e0dea13-c646-44d1-997c-d27c83bb9427-client-ca\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948506 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w26ns\" (UniqueName: \"kubernetes.io/projected/8e0dea13-c646-44d1-997c-d27c83bb9427-kube-api-access-w26ns\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948530 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e0dea13-c646-44d1-997c-d27c83bb9427-config\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948802 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-proxy-ca-bundles\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.948995 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-client-ca\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.949802 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-config\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.949865 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-client-ca\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.950639 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a71fdcab-cb7c-4489-82d9-69c310a6d352-proxy-ca-bundles\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.955311 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a71fdcab-cb7c-4489-82d9-69c310a6d352-serving-cert\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:51 crc kubenswrapper[4786]: I1002 06:58:51.962437 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfr4b\" (UniqueName: \"kubernetes.io/projected/a71fdcab-cb7c-4489-82d9-69c310a6d352-kube-api-access-wfr4b\") pod \"controller-manager-777ddb8f9d-wrjrd\" (UID: \"a71fdcab-cb7c-4489-82d9-69c310a6d352\") " pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.050053 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w26ns\" (UniqueName: \"kubernetes.io/projected/8e0dea13-c646-44d1-997c-d27c83bb9427-kube-api-access-w26ns\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.050097 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e0dea13-c646-44d1-997c-d27c83bb9427-config\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.050254 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e0dea13-c646-44d1-997c-d27c83bb9427-serving-cert\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.050312 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e0dea13-c646-44d1-997c-d27c83bb9427-client-ca\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.051083 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e0dea13-c646-44d1-997c-d27c83bb9427-client-ca\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.051816 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e0dea13-c646-44d1-997c-d27c83bb9427-config\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.053371 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e0dea13-c646-44d1-997c-d27c83bb9427-serving-cert\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.063644 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w26ns\" (UniqueName: \"kubernetes.io/projected/8e0dea13-c646-44d1-997c-d27c83bb9427-kube-api-access-w26ns\") pod \"route-controller-manager-74796b6877-rgzn8\" (UID: \"8e0dea13-c646-44d1-997c-d27c83bb9427\") " pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.100317 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.191816 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8c9c2b-6b0f-4173-91c9-1dbed356baa9" path="/var/lib/kubelet/pods/4d8c9c2b-6b0f-4173-91c9-1dbed356baa9/volumes"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.192968 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6647561d-fd2f-4b60-b155-ffbc54b0da4f" path="/var/lib/kubelet/pods/6647561d-fd2f-4b60-b155-ffbc54b0da4f/volumes"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.219580 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.472438 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd"]
Oct 02 06:58:52 crc kubenswrapper[4786]: W1002 06:58:52.480613 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71fdcab_cb7c_4489_82d9_69c310a6d352.slice/crio-4edd9e4a297308b9fca4ff4615a7b70509877a73453a47db20962ce3e66a6e6a WatchSource:0}: Error finding container 4edd9e4a297308b9fca4ff4615a7b70509877a73453a47db20962ce3e66a6e6a: Status 404 returned error can't find the container with id 4edd9e4a297308b9fca4ff4615a7b70509877a73453a47db20962ce3e66a6e6a
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.597854 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8"]
Oct 02 06:58:52 crc kubenswrapper[4786]: W1002 06:58:52.605253 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0dea13_c646_44d1_997c_d27c83bb9427.slice/crio-440b4fe30328c875dd324b2325bb2909d1bd7b1435f804858828c9f2e8faca7e WatchSource:0}: Error finding container 440b4fe30328c875dd324b2325bb2909d1bd7b1435f804858828c9f2e8faca7e: Status 404 returned error can't find the container with id 440b4fe30328c875dd324b2325bb2909d1bd7b1435f804858828c9f2e8faca7e
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.932837 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8" event={"ID":"8e0dea13-c646-44d1-997c-d27c83bb9427","Type":"ContainerStarted","Data":"20d535d5b0decb04675657bbd53ac71e187b84277d913989496a39ad199281a6"}
Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.933575 4786
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8" Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.933643 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8" event={"ID":"8e0dea13-c646-44d1-997c-d27c83bb9427","Type":"ContainerStarted","Data":"440b4fe30328c875dd324b2325bb2909d1bd7b1435f804858828c9f2e8faca7e"} Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.935875 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd" event={"ID":"a71fdcab-cb7c-4489-82d9-69c310a6d352","Type":"ContainerStarted","Data":"ee30a9fd76e9bc74973651b436917b842272360aaffa207b8d2f82ddf0699834"} Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.935904 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd" event={"ID":"a71fdcab-cb7c-4489-82d9-69c310a6d352","Type":"ContainerStarted","Data":"4edd9e4a297308b9fca4ff4615a7b70509877a73453a47db20962ce3e66a6e6a"} Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.936107 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd" Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.939923 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd" Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.953456 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8" podStartSLOduration=2.9534419290000002 podStartE2EDuration="2.953441929s" podCreationTimestamp="2025-10-02 06:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:58:52.949865173 +0000 UTC m=+743.071048324" watchObservedRunningTime="2025-10-02 06:58:52.953441929 +0000 UTC m=+743.074625060" Oct 02 06:58:52 crc kubenswrapper[4786]: I1002 06:58:52.973085 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-777ddb8f9d-wrjrd" podStartSLOduration=1.9730702180000002 podStartE2EDuration="1.973070218s" podCreationTimestamp="2025-10-02 06:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 06:58:52.968922465 +0000 UTC m=+743.090105606" watchObservedRunningTime="2025-10-02 06:58:52.973070218 +0000 UTC m=+743.094253349" Oct 02 06:58:53 crc kubenswrapper[4786]: I1002 06:58:53.134371 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74796b6877-rgzn8" Oct 02 06:58:54 crc kubenswrapper[4786]: I1002 06:58:54.874634 4786 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 06:58:57 crc kubenswrapper[4786]: I1002 06:58:57.497749 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 06:58:57 crc kubenswrapper[4786]: I1002 06:58:57.497817 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 06:58:57 crc 
kubenswrapper[4786]: I1002 06:58:57.497861 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 06:58:57 crc kubenswrapper[4786]: I1002 06:58:57.498496 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"613104429ff7e56e4d88582bf64ddcf8603f93d2b0b8b15a934f56112fabd10d"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 06:58:57 crc kubenswrapper[4786]: I1002 06:58:57.498554 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://613104429ff7e56e4d88582bf64ddcf8603f93d2b0b8b15a934f56112fabd10d" gracePeriod=600 Oct 02 06:58:57 crc kubenswrapper[4786]: I1002 06:58:57.965458 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="613104429ff7e56e4d88582bf64ddcf8603f93d2b0b8b15a934f56112fabd10d" exitCode=0 Oct 02 06:58:57 crc kubenswrapper[4786]: I1002 06:58:57.965509 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"613104429ff7e56e4d88582bf64ddcf8603f93d2b0b8b15a934f56112fabd10d"} Oct 02 06:58:57 crc kubenswrapper[4786]: I1002 06:58:57.965542 4786 scope.go:117] "RemoveContainer" containerID="6b5bd4f7ef853564be38c5c22d2f88f290b16d0915727305dc89bbed1ec9a81c" Oct 02 06:58:59 crc kubenswrapper[4786]: I1002 06:58:59.977107 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" 
event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"d7bde13c3a2f638d163652c020a2e40b2d8399d146317237502071d1e44c36be"} Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.692978 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85dddbfdbf-85pgc"] Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.694534 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.702801 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.703009 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-m82cl" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.703214 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.704242 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.716645 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85dddbfdbf-85pgc"] Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.756014 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5465cc9897-6w9ft"] Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.756985 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.758893 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.767920 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5465cc9897-6w9ft"] Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.786226 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-config\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.786281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrk9f\" (UniqueName: \"kubernetes.io/projected/4e99c2f0-d917-400c-9fb9-48abce2f4854-kube-api-access-mrk9f\") pod \"dnsmasq-dns-85dddbfdbf-85pgc\" (UID: \"4e99c2f0-d917-400c-9fb9-48abce2f4854\") " pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.786329 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zsg\" (UniqueName: \"kubernetes.io/projected/388590eb-6c04-4f86-9676-11054d02eae7-kube-api-access-84zsg\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.786386 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e99c2f0-d917-400c-9fb9-48abce2f4854-config\") pod \"dnsmasq-dns-85dddbfdbf-85pgc\" (UID: \"4e99c2f0-d917-400c-9fb9-48abce2f4854\") " 
pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.786411 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-dns-svc\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.886996 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zsg\" (UniqueName: \"kubernetes.io/projected/388590eb-6c04-4f86-9676-11054d02eae7-kube-api-access-84zsg\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.887040 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e99c2f0-d917-400c-9fb9-48abce2f4854-config\") pod \"dnsmasq-dns-85dddbfdbf-85pgc\" (UID: \"4e99c2f0-d917-400c-9fb9-48abce2f4854\") " pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.887065 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-dns-svc\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.887103 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-config\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc 
kubenswrapper[4786]: I1002 06:59:01.887138 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrk9f\" (UniqueName: \"kubernetes.io/projected/4e99c2f0-d917-400c-9fb9-48abce2f4854-kube-api-access-mrk9f\") pod \"dnsmasq-dns-85dddbfdbf-85pgc\" (UID: \"4e99c2f0-d917-400c-9fb9-48abce2f4854\") " pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.887987 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e99c2f0-d917-400c-9fb9-48abce2f4854-config\") pod \"dnsmasq-dns-85dddbfdbf-85pgc\" (UID: \"4e99c2f0-d917-400c-9fb9-48abce2f4854\") " pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.887990 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-dns-svc\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.887987 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-config\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.905504 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrk9f\" (UniqueName: \"kubernetes.io/projected/4e99c2f0-d917-400c-9fb9-48abce2f4854-kube-api-access-mrk9f\") pod \"dnsmasq-dns-85dddbfdbf-85pgc\" (UID: \"4e99c2f0-d917-400c-9fb9-48abce2f4854\") " pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 06:59:01 crc kubenswrapper[4786]: I1002 06:59:01.905531 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-84zsg\" (UniqueName: \"kubernetes.io/projected/388590eb-6c04-4f86-9676-11054d02eae7-kube-api-access-84zsg\") pod \"dnsmasq-dns-5465cc9897-6w9ft\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:02 crc kubenswrapper[4786]: I1002 06:59:02.007163 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 06:59:02 crc kubenswrapper[4786]: I1002 06:59:02.073606 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 06:59:02 crc kubenswrapper[4786]: I1002 06:59:02.400901 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85dddbfdbf-85pgc"] Oct 02 06:59:02 crc kubenswrapper[4786]: W1002 06:59:02.405664 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e99c2f0_d917_400c_9fb9_48abce2f4854.slice/crio-6239854b1746bf971777b1c7b66626fceb540abaa5551e541415212cb06b4279 WatchSource:0}: Error finding container 6239854b1746bf971777b1c7b66626fceb540abaa5551e541415212cb06b4279: Status 404 returned error can't find the container with id 6239854b1746bf971777b1c7b66626fceb540abaa5551e541415212cb06b4279 Oct 02 06:59:02 crc kubenswrapper[4786]: I1002 06:59:02.510795 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5465cc9897-6w9ft"] Oct 02 06:59:02 crc kubenswrapper[4786]: W1002 06:59:02.514401 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod388590eb_6c04_4f86_9676_11054d02eae7.slice/crio-4ca4fe9f4118797ad1af909c2b876c240b3302826a8345be7f2a5501268ed38e WatchSource:0}: Error finding container 4ca4fe9f4118797ad1af909c2b876c240b3302826a8345be7f2a5501268ed38e: Status 404 returned error can't find the container with id 
4ca4fe9f4118797ad1af909c2b876c240b3302826a8345be7f2a5501268ed38e Oct 02 06:59:03 crc kubenswrapper[4786]: I1002 06:59:03.000133 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" event={"ID":"4e99c2f0-d917-400c-9fb9-48abce2f4854","Type":"ContainerStarted","Data":"6239854b1746bf971777b1c7b66626fceb540abaa5551e541415212cb06b4279"} Oct 02 06:59:03 crc kubenswrapper[4786]: I1002 06:59:03.002288 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" event={"ID":"388590eb-6c04-4f86-9676-11054d02eae7","Type":"ContainerStarted","Data":"4ca4fe9f4118797ad1af909c2b876c240b3302826a8345be7f2a5501268ed38e"} Oct 02 06:59:04 crc kubenswrapper[4786]: I1002 06:59:04.714160 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5465cc9897-6w9ft"] Oct 02 06:59:04 crc kubenswrapper[4786]: I1002 06:59:04.752244 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58cfb8dc65-t9s4x"] Oct 02 06:59:04 crc kubenswrapper[4786]: I1002 06:59:04.753359 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:04 crc kubenswrapper[4786]: I1002 06:59:04.767755 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58cfb8dc65-t9s4x"] Oct 02 06:59:04 crc kubenswrapper[4786]: I1002 06:59:04.933858 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-dns-svc\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:04 crc kubenswrapper[4786]: I1002 06:59:04.934391 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74rtl\" (UniqueName: \"kubernetes.io/projected/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-kube-api-access-74rtl\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:04 crc kubenswrapper[4786]: I1002 06:59:04.934476 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-config\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.034267 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85dddbfdbf-85pgc"] Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.036082 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-dns-svc\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:05 
crc kubenswrapper[4786]: I1002 06:59:05.036231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74rtl\" (UniqueName: \"kubernetes.io/projected/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-kube-api-access-74rtl\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.036294 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-config\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.037155 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-config\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.037156 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-dns-svc\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.051916 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c98cb8667-nk2d7"] Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.053512 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.062551 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c98cb8667-nk2d7"] Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.085654 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74rtl\" (UniqueName: \"kubernetes.io/projected/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-kube-api-access-74rtl\") pod \"dnsmasq-dns-58cfb8dc65-t9s4x\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.239341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5brx\" (UniqueName: \"kubernetes.io/projected/be5227a5-2fd9-4e47-ba91-e96d06ad9700-kube-api-access-c5brx\") pod \"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.239459 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-dns-svc\") pod \"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.239488 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-config\") pod \"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.341171 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-dns-svc\") pod \"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.341233 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-config\") pod \"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.341364 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5brx\" (UniqueName: \"kubernetes.io/projected/be5227a5-2fd9-4e47-ba91-e96d06ad9700-kube-api-access-c5brx\") pod \"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.342637 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-dns-svc\") pod \"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.343033 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-config\") pod \"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.355850 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5brx\" (UniqueName: \"kubernetes.io/projected/be5227a5-2fd9-4e47-ba91-e96d06ad9700-kube-api-access-c5brx\") pod 
\"dnsmasq-dns-5c98cb8667-nk2d7\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.369573 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.388411 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.784893 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58cfb8dc65-t9s4x"] Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.886538 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c98cb8667-nk2d7"] Oct 02 06:59:05 crc kubenswrapper[4786]: W1002 06:59:05.890063 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe5227a5_2fd9_4e47_ba91_e96d06ad9700.slice/crio-5234241b94f3231218364508c16eccf7f318ea531f4b503967c105850de4fbd0 WatchSource:0}: Error finding container 5234241b94f3231218364508c16eccf7f318ea531f4b503967c105850de4fbd0: Status 404 returned error can't find the container with id 5234241b94f3231218364508c16eccf7f318ea531f4b503967c105850de4fbd0 Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.934189 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.947763 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.947908 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.951072 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.951099 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.951076 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.951083 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vbf4k" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.951221 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.951600 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 06:59:05 crc kubenswrapper[4786]: I1002 06:59:05.956095 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.032081 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" event={"ID":"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce","Type":"ContainerStarted","Data":"212dbbeea78ffe08985183a8740c74fe67f10721663f8e24a08da4b7b89e3d9a"} Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.033430 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" event={"ID":"be5227a5-2fd9-4e47-ba91-e96d06ad9700","Type":"ContainerStarted","Data":"5234241b94f3231218364508c16eccf7f318ea531f4b503967c105850de4fbd0"} Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.049889 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050077 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050112 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050160 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a34253b-beed-468f-8bad-82366a5eb5c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050179 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050283 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a34253b-beed-468f-8bad-82366a5eb5c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050418 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050475 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050523 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050618 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.050642 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rrf\" (UniqueName: 
\"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-kube-api-access-99rrf\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.151758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.151799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.151864 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a34253b-beed-468f-8bad-82366a5eb5c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.151882 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.151903 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a34253b-beed-468f-8bad-82366a5eb5c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.151931 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.151946 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.151990 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.152023 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.152039 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99rrf\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-kube-api-access-99rrf\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.152069 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.154636 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.155957 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.156211 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-config-data\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.156408 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.156505 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.157799 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a34253b-beed-468f-8bad-82366a5eb5c3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.158175 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.159037 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.159531 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.162503 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a34253b-beed-468f-8bad-82366a5eb5c3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.169085 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-99rrf\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-kube-api-access-99rrf\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.172234 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.194269 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.196889 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.197047 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.199993 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4p8pv" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.200096 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.200132 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.200134 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.200217 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.200451 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.200566 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.275454 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.355929 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356005 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356236 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g4zj\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-kube-api-access-4g4zj\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356282 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356329 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356355 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356394 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356473 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356559 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") 
" pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.356625 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458237 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458281 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458306 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458355 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g4zj\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-kube-api-access-4g4zj\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: 
I1002 06:59:06.458373 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458392 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458424 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458451 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458480 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458505 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.458807 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.460013 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.460071 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.460883 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.461235 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.461559 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.466770 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.467237 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.467522 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" 
Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.467980 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.475206 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g4zj\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-kube-api-access-4g4zj\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.479309 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.516596 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.645785 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 06:59:06 crc kubenswrapper[4786]: W1002 06:59:06.656265 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a34253b_beed_468f_8bad_82366a5eb5c3.slice/crio-cd330b6f989a2bff319e87567b1f4b2c6b8b8a75685ceb62a02de19845376123 WatchSource:0}: Error finding container cd330b6f989a2bff319e87567b1f4b2c6b8b8a75685ceb62a02de19845376123: Status 404 returned error can't find the container with id cd330b6f989a2bff319e87567b1f4b2c6b8b8a75685ceb62a02de19845376123 Oct 02 06:59:06 crc kubenswrapper[4786]: I1002 06:59:06.882826 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 06:59:06 crc kubenswrapper[4786]: W1002 06:59:06.889532 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8d55d69_26a7_4b9f_9b6f_d63b0dabfb94.slice/crio-541088e2f41b0e4885c54ba993463851753f6e92f27130e5f9d6c41d6d95886a WatchSource:0}: Error finding container 541088e2f41b0e4885c54ba993463851753f6e92f27130e5f9d6c41d6d95886a: Status 404 returned error can't find the container with id 541088e2f41b0e4885c54ba993463851753f6e92f27130e5f9d6c41d6d95886a Oct 02 06:59:07 crc kubenswrapper[4786]: I1002 06:59:07.043233 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94","Type":"ContainerStarted","Data":"541088e2f41b0e4885c54ba993463851753f6e92f27130e5f9d6c41d6d95886a"} Oct 02 06:59:07 crc kubenswrapper[4786]: I1002 06:59:07.044345 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"7a34253b-beed-468f-8bad-82366a5eb5c3","Type":"ContainerStarted","Data":"cd330b6f989a2bff319e87567b1f4b2c6b8b8a75685ceb62a02de19845376123"} Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.385913 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.387494 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.391335 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.392083 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-hz65k" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.392134 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.392213 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.392597 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.394756 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.395867 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.498597 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-secrets\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " 
pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.498676 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-config-data-default\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.498729 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.498754 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.498813 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.498857 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/106dd908-3398-489e-a39f-b684b5eecd2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc 
kubenswrapper[4786]: I1002 06:59:08.498888 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.498906 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9bqx\" (UniqueName: \"kubernetes.io/projected/106dd908-3398-489e-a39f-b684b5eecd2b-kube-api-access-x9bqx\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.498933 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601192 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-secrets\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601252 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-config-data-default\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601285 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601312 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601383 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601424 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/106dd908-3398-489e-a39f-b684b5eecd2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601453 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601468 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9bqx\" (UniqueName: 
\"kubernetes.io/projected/106dd908-3398-489e-a39f-b684b5eecd2b-kube-api-access-x9bqx\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.601491 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.602050 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.602562 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/106dd908-3398-489e-a39f-b684b5eecd2b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.602885 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-kolla-config\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.603275 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.606968 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/106dd908-3398-489e-a39f-b684b5eecd2b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.608778 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-secrets\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.612431 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.615087 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106dd908-3398-489e-a39f-b684b5eecd2b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.622420 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.630608 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x9bqx\" (UniqueName: \"kubernetes.io/projected/106dd908-3398-489e-a39f-b684b5eecd2b-kube-api-access-x9bqx\") pod \"openstack-galera-0\" (UID: \"106dd908-3398-489e-a39f-b684b5eecd2b\") " pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.717851 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.850945 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.852597 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.854512 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.854775 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.854848 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.856899 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-svnbr" Oct 02 06:59:08 crc kubenswrapper[4786]: I1002 06:59:08.860756 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.010478 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wgf6\" (UniqueName: \"kubernetes.io/projected/e4c62a8c-eb69-4f97-a299-a44f87315f81-kube-api-access-4wgf6\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " 
pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.010534 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.010566 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.010622 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.010651 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.010912 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " 
pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.010970 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.010996 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4c62a8c-eb69-4f97-a299-a44f87315f81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.011018 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.105769 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112429 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112477 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112531 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112554 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112573 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4c62a8c-eb69-4f97-a299-a44f87315f81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112592 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112708 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wgf6\" (UniqueName: 
\"kubernetes.io/projected/e4c62a8c-eb69-4f97-a299-a44f87315f81-kube-api-access-4wgf6\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112745 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.112770 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.113671 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.113928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e4c62a8c-eb69-4f97-a299-a44f87315f81-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.114528 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.120704 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e4c62a8c-eb69-4f97-a299-a44f87315f81-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.120779 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.120951 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.121887 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.123207 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4c62a8c-eb69-4f97-a299-a44f87315f81-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 
06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.132493 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wgf6\" (UniqueName: \"kubernetes.io/projected/e4c62a8c-eb69-4f97-a299-a44f87315f81-kube-api-access-4wgf6\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.144566 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e4c62a8c-eb69-4f97-a299-a44f87315f81\") " pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.177988 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.465474 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.469995 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.475998 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.476206 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4ms74" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.476334 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.485826 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.607869 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.625121 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp56m\" (UniqueName: \"kubernetes.io/projected/26e71aaf-fc49-4218-b001-433de642f9ae-kube-api-access-cp56m\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.625162 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26e71aaf-fc49-4218-b001-433de642f9ae-kolla-config\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.625268 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e71aaf-fc49-4218-b001-433de642f9ae-combined-ca-bundle\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " 
pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.625295 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26e71aaf-fc49-4218-b001-433de642f9ae-config-data\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.625315 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/26e71aaf-fc49-4218-b001-433de642f9ae-memcached-tls-certs\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.727402 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e71aaf-fc49-4218-b001-433de642f9ae-combined-ca-bundle\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.727470 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26e71aaf-fc49-4218-b001-433de642f9ae-config-data\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.727492 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/26e71aaf-fc49-4218-b001-433de642f9ae-memcached-tls-certs\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.727660 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cp56m\" (UniqueName: \"kubernetes.io/projected/26e71aaf-fc49-4218-b001-433de642f9ae-kube-api-access-cp56m\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.727726 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26e71aaf-fc49-4218-b001-433de642f9ae-kolla-config\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.728209 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/26e71aaf-fc49-4218-b001-433de642f9ae-config-data\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.728433 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/26e71aaf-fc49-4218-b001-433de642f9ae-kolla-config\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.736517 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e71aaf-fc49-4218-b001-433de642f9ae-combined-ca-bundle\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.741146 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/26e71aaf-fc49-4218-b001-433de642f9ae-memcached-tls-certs\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc 
kubenswrapper[4786]: I1002 06:59:09.743086 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp56m\" (UniqueName: \"kubernetes.io/projected/26e71aaf-fc49-4218-b001-433de642f9ae-kube-api-access-cp56m\") pod \"memcached-0\" (UID: \"26e71aaf-fc49-4218-b001-433de642f9ae\") " pod="openstack/memcached-0" Oct 02 06:59:09 crc kubenswrapper[4786]: I1002 06:59:09.798005 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 06:59:10 crc kubenswrapper[4786]: I1002 06:59:10.074633 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"106dd908-3398-489e-a39f-b684b5eecd2b","Type":"ContainerStarted","Data":"82636cb3be39df51bd1442a290cceec94c2d0b67435da44fb4db84223de7cc6e"} Oct 02 06:59:10 crc kubenswrapper[4786]: I1002 06:59:10.076255 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4c62a8c-eb69-4f97-a299-a44f87315f81","Type":"ContainerStarted","Data":"03802a2c98b6919963752c73b431bcf9bb864f67c8608c657f81636d0d566c00"} Oct 02 06:59:10 crc kubenswrapper[4786]: I1002 06:59:10.190047 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 06:59:10 crc kubenswrapper[4786]: W1002 06:59:10.192614 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26e71aaf_fc49_4218_b001_433de642f9ae.slice/crio-c1b7d23f577ccc165c255af0a0e35c046b510b2cc796f4104d0a877ebc05d559 WatchSource:0}: Error finding container c1b7d23f577ccc165c255af0a0e35c046b510b2cc796f4104d0a877ebc05d559: Status 404 returned error can't find the container with id c1b7d23f577ccc165c255af0a0e35c046b510b2cc796f4104d0a877ebc05d559 Oct 02 06:59:11 crc kubenswrapper[4786]: I1002 06:59:11.076193 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 06:59:11 crc 
kubenswrapper[4786]: I1002 06:59:11.079708 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 06:59:11 crc kubenswrapper[4786]: I1002 06:59:11.080595 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 06:59:11 crc kubenswrapper[4786]: I1002 06:59:11.083199 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-s6nlm" Oct 02 06:59:11 crc kubenswrapper[4786]: I1002 06:59:11.088830 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"26e71aaf-fc49-4218-b001-433de642f9ae","Type":"ContainerStarted","Data":"c1b7d23f577ccc165c255af0a0e35c046b510b2cc796f4104d0a877ebc05d559"} Oct 02 06:59:11 crc kubenswrapper[4786]: I1002 06:59:11.171543 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvql2\" (UniqueName: \"kubernetes.io/projected/565ac7e2-35f2-4085-96f7-d6f78e14a4e2-kube-api-access-tvql2\") pod \"kube-state-metrics-0\" (UID: \"565ac7e2-35f2-4085-96f7-d6f78e14a4e2\") " pod="openstack/kube-state-metrics-0" Oct 02 06:59:11 crc kubenswrapper[4786]: I1002 06:59:11.273741 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvql2\" (UniqueName: \"kubernetes.io/projected/565ac7e2-35f2-4085-96f7-d6f78e14a4e2-kube-api-access-tvql2\") pod \"kube-state-metrics-0\" (UID: \"565ac7e2-35f2-4085-96f7-d6f78e14a4e2\") " pod="openstack/kube-state-metrics-0" Oct 02 06:59:11 crc kubenswrapper[4786]: I1002 06:59:11.290275 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvql2\" (UniqueName: \"kubernetes.io/projected/565ac7e2-35f2-4085-96f7-d6f78e14a4e2-kube-api-access-tvql2\") pod \"kube-state-metrics-0\" (UID: \"565ac7e2-35f2-4085-96f7-d6f78e14a4e2\") " pod="openstack/kube-state-metrics-0" Oct 02 06:59:11 crc 
kubenswrapper[4786]: I1002 06:59:11.405820 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 06:59:11 crc kubenswrapper[4786]: I1002 06:59:11.829281 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 06:59:12 crc kubenswrapper[4786]: I1002 06:59:12.098241 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"565ac7e2-35f2-4085-96f7-d6f78e14a4e2","Type":"ContainerStarted","Data":"b49d156d3571119acc3a0afaa2121ba79dcdb836c12d5aec7c4063e04cf7ce7e"} Oct 02 06:59:15 crc kubenswrapper[4786]: I1002 06:59:15.148022 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"565ac7e2-35f2-4085-96f7-d6f78e14a4e2","Type":"ContainerStarted","Data":"3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08"} Oct 02 06:59:15 crc kubenswrapper[4786]: I1002 06:59:15.148588 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 06:59:15 crc kubenswrapper[4786]: I1002 06:59:15.165228 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.747721523 podStartE2EDuration="4.165212179s" podCreationTimestamp="2025-10-02 06:59:11 +0000 UTC" firstStartedPulling="2025-10-02 06:59:11.843127067 +0000 UTC m=+761.964310198" lastFinishedPulling="2025-10-02 06:59:14.260617722 +0000 UTC m=+764.381800854" observedRunningTime="2025-10-02 06:59:15.16092805 +0000 UTC m=+765.282111191" watchObservedRunningTime="2025-10-02 06:59:15.165212179 +0000 UTC m=+765.286395310" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.114141 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gl77n"] Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.115356 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.117720 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.117800 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-clg6h" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.117814 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.123976 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gl77n"] Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.160491 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-lf7tj"] Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.163245 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f445s\" (UniqueName: \"kubernetes.io/projected/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-kube-api-access-f445s\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.163325 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-log-ovn\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.163365 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-ovn-controller-tls-certs\") pod 
\"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.163389 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-run-ovn\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.163446 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-run\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.163480 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-scripts\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.163581 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-combined-ca-bundle\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.174796 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.177205 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lf7tj"] Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.265982 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-log\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266050 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-combined-ca-bundle\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266084 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-lib\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266112 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-run\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266151 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f445s\" (UniqueName: 
\"kubernetes.io/projected/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-kube-api-access-f445s\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-etc-ovs\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266203 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-log-ovn\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266220 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-ovn-controller-tls-certs\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266244 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-run-ovn\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266310 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-run\") pod \"ovn-controller-gl77n\" (UID: 
\"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266346 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-scripts\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266386 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rtj\" (UniqueName: \"kubernetes.io/projected/452e4743-083e-420b-9dfc-ea81e1376373-kube-api-access-92rtj\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.266429 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/452e4743-083e-420b-9dfc-ea81e1376373-scripts\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.267096 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-log-ovn\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.267350 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-run\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 
06:59:16.270724 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-var-run-ovn\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.287468 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-scripts\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.313713 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f445s\" (UniqueName: \"kubernetes.io/projected/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-kube-api-access-f445s\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.317925 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-ovn-controller-tls-certs\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.333566 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b71af7-f7f4-45e8-b8f0-c7428f54a37d-combined-ca-bundle\") pod \"ovn-controller-gl77n\" (UID: \"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d\") " pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.379518 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92rtj\" (UniqueName: 
\"kubernetes.io/projected/452e4743-083e-420b-9dfc-ea81e1376373-kube-api-access-92rtj\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.379628 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/452e4743-083e-420b-9dfc-ea81e1376373-scripts\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.379671 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-log\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.379735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-lib\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.379762 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-run\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.379816 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-etc-ovs\") pod \"ovn-controller-ovs-lf7tj\" (UID: 
\"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.380039 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-log\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.380109 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-etc-ovs\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.380245 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-lib\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.380258 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/452e4743-083e-420b-9dfc-ea81e1376373-var-run\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.382541 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/452e4743-083e-420b-9dfc-ea81e1376373-scripts\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.394109 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-92rtj\" (UniqueName: \"kubernetes.io/projected/452e4743-083e-420b-9dfc-ea81e1376373-kube-api-access-92rtj\") pod \"ovn-controller-ovs-lf7tj\" (UID: \"452e4743-083e-420b-9dfc-ea81e1376373\") " pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.438751 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gl77n" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.540928 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 06:59:16 crc kubenswrapper[4786]: I1002 06:59:16.851769 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gl77n"] Oct 02 06:59:16 crc kubenswrapper[4786]: W1002 06:59:16.859269 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0b71af7_f7f4_45e8_b8f0_c7428f54a37d.slice/crio-ded5e87324a6ebeffd96ffd96123711ef991a80707ce402d682c36d576f8cc1e WatchSource:0}: Error finding container ded5e87324a6ebeffd96ffd96123711ef991a80707ce402d682c36d576f8cc1e: Status 404 returned error can't find the container with id ded5e87324a6ebeffd96ffd96123711ef991a80707ce402d682c36d576f8cc1e Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.132191 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lf7tj"] Oct 02 06:59:17 crc kubenswrapper[4786]: W1002 06:59:17.138732 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod452e4743_083e_420b_9dfc_ea81e1376373.slice/crio-e3abf91ef553b5a3365348f9a421136ae813d5990280e2089457b44ccaf7ee37 WatchSource:0}: Error finding container e3abf91ef553b5a3365348f9a421136ae813d5990280e2089457b44ccaf7ee37: Status 404 returned error can't find the container with id 
e3abf91ef553b5a3365348f9a421136ae813d5990280e2089457b44ccaf7ee37 Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.162094 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-jz66d"] Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.172478 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jz66d"] Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.172578 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.173990 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.174185 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.195190 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/003620a3-92b9-4640-be9e-9a8b064fc888-ovs-rundir\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.195248 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/003620a3-92b9-4640-be9e-9a8b064fc888-ovn-rundir\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.195446 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46hmh\" (UniqueName: 
\"kubernetes.io/projected/003620a3-92b9-4640-be9e-9a8b064fc888-kube-api-access-46hmh\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.195642 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003620a3-92b9-4640-be9e-9a8b064fc888-config\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.195730 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003620a3-92b9-4640-be9e-9a8b064fc888-combined-ca-bundle\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.195850 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/003620a3-92b9-4640-be9e-9a8b064fc888-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.239993 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gl77n" event={"ID":"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d","Type":"ContainerStarted","Data":"ded5e87324a6ebeffd96ffd96123711ef991a80707ce402d682c36d576f8cc1e"} Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.241589 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lf7tj" 
event={"ID":"452e4743-083e-420b-9dfc-ea81e1376373","Type":"ContainerStarted","Data":"e3abf91ef553b5a3365348f9a421136ae813d5990280e2089457b44ccaf7ee37"} Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.296859 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003620a3-92b9-4640-be9e-9a8b064fc888-combined-ca-bundle\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.296958 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/003620a3-92b9-4640-be9e-9a8b064fc888-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.297028 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/003620a3-92b9-4640-be9e-9a8b064fc888-ovs-rundir\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.297081 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/003620a3-92b9-4640-be9e-9a8b064fc888-ovn-rundir\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.297147 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46hmh\" (UniqueName: \"kubernetes.io/projected/003620a3-92b9-4640-be9e-9a8b064fc888-kube-api-access-46hmh\") pod 
\"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.297205 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003620a3-92b9-4640-be9e-9a8b064fc888-config\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.297508 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/003620a3-92b9-4640-be9e-9a8b064fc888-ovn-rundir\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.297589 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/003620a3-92b9-4640-be9e-9a8b064fc888-ovs-rundir\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.297857 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003620a3-92b9-4640-be9e-9a8b064fc888-config\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.304379 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003620a3-92b9-4640-be9e-9a8b064fc888-combined-ca-bundle\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" 
Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.311192 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/003620a3-92b9-4640-be9e-9a8b064fc888-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.315116 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46hmh\" (UniqueName: \"kubernetes.io/projected/003620a3-92b9-4640-be9e-9a8b064fc888-kube-api-access-46hmh\") pod \"ovn-controller-metrics-jz66d\" (UID: \"003620a3-92b9-4640-be9e-9a8b064fc888\") " pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.497891 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jz66d" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.929644 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jz66d"] Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.966880 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.968193 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.971866 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.972309 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.972492 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.972730 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-42mmt" Oct 02 06:59:17 crc kubenswrapper[4786]: I1002 06:59:17.998201 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.114201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.114362 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-config\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.114393 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.114419 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.114541 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.114663 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsp5b\" (UniqueName: \"kubernetes.io/projected/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-kube-api-access-dsp5b\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.114712 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.114735 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " 
pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.217306 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.217469 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsp5b\" (UniqueName: \"kubernetes.io/projected/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-kube-api-access-dsp5b\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.217523 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.217546 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.217591 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.217709 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-config\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.217737 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.217769 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.218579 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.219021 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.221133 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.222034 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.227492 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.228504 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.233402 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.235287 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsp5b\" (UniqueName: \"kubernetes.io/projected/11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9-kube-api-access-dsp5b\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.240601 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9\") " pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.254241 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jz66d" event={"ID":"003620a3-92b9-4640-be9e-9a8b064fc888","Type":"ContainerStarted","Data":"5f75c560b914b658ed1397d2799bf9f7e4b18cb278ba8354c15ebb489eec1931"} Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.297194 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.821424 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.841718 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.843058 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.848362 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.853215 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4ht96" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.853286 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.853837 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 02 06:59:18 crc kubenswrapper[4786]: I1002 06:59:18.854369 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.033295 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.033555 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bttvb\" (UniqueName: \"kubernetes.io/projected/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-kube-api-access-bttvb\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.033737 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.033796 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.033836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.033862 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.033923 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.033988 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " 
pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.135931 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bttvb\" (UniqueName: \"kubernetes.io/projected/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-kube-api-access-bttvb\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.136006 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-config\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.136059 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.136089 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.136113 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.136139 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.136185 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.136226 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.136963 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.137239 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-config\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.137462 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 
02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.138272 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.147973 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.152373 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.152642 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.157674 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bttvb\" (UniqueName: \"kubernetes.io/projected/9edb2cb3-f47c-4f56-8181-1ec8f9d774f6-kube-api-access-bttvb\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.165367 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6\") " pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.280677 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9","Type":"ContainerStarted","Data":"de1c91614bf99d3646d10586e944415cf78ca28c66711dbf20cefc65b6b32507"} Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.461783 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 06:59:19 crc kubenswrapper[4786]: I1002 06:59:19.900252 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 06:59:20 crc kubenswrapper[4786]: W1002 06:59:20.837327 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9edb2cb3_f47c_4f56_8181_1ec8f9d774f6.slice/crio-70e94b777697e3385f175966b671372386a4b5b0970dd3e544796bb801c7b29f WatchSource:0}: Error finding container 70e94b777697e3385f175966b671372386a4b5b0970dd3e544796bb801c7b29f: Status 404 returned error can't find the container with id 70e94b777697e3385f175966b671372386a4b5b0970dd3e544796bb801c7b29f Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.298589 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jz66d" event={"ID":"003620a3-92b9-4640-be9e-9a8b064fc888","Type":"ContainerStarted","Data":"f3dc32eb5aca2d9d80505e645b1f97eee894f28fb6f74fda96f87a4bfe7fea7a"} Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.299868 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6","Type":"ContainerStarted","Data":"70e94b777697e3385f175966b671372386a4b5b0970dd3e544796bb801c7b29f"} Oct 02 06:59:21 
crc kubenswrapper[4786]: I1002 06:59:21.410159 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.420640 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-jz66d" podStartSLOduration=1.373136438 podStartE2EDuration="4.420627285s" podCreationTimestamp="2025-10-02 06:59:17 +0000 UTC" firstStartedPulling="2025-10-02 06:59:17.946733215 +0000 UTC m=+768.067916346" lastFinishedPulling="2025-10-02 06:59:20.994224061 +0000 UTC m=+771.115407193" observedRunningTime="2025-10-02 06:59:21.314149801 +0000 UTC m=+771.435333022" watchObservedRunningTime="2025-10-02 06:59:21.420627285 +0000 UTC m=+771.541810416" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.508956 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58cfb8dc65-t9s4x"] Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.539789 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c7fbc4849-8ffsc"] Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.541011 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.543788 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7fbc4849-8ffsc"] Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.543908 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.616388 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c98cb8667-nk2d7"] Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.636721 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddc467bbc-h5rtm"] Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.638041 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.641889 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.650266 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddc467bbc-h5rtm"] Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.687449 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.687610 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-dns-svc\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " 
pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.687739 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppgrd\" (UniqueName: \"kubernetes.io/projected/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-kube-api-access-ppgrd\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.687836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-config\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789357 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-config\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppgrd\" (UniqueName: \"kubernetes.io/projected/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-kube-api-access-ppgrd\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789443 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: 
\"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789463 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789484 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkh4\" (UniqueName: \"kubernetes.io/projected/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-kube-api-access-lqkh4\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789503 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-config\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789539 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-dns-svc\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789606 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: 
\"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.789623 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-dns-svc\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.790649 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-config\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.791161 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.791656 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-dns-svc\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.828149 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppgrd\" (UniqueName: \"kubernetes.io/projected/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-kube-api-access-ppgrd\") pod \"dnsmasq-dns-6c7fbc4849-8ffsc\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc 
kubenswrapper[4786]: I1002 06:59:21.857377 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.891199 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkh4\" (UniqueName: \"kubernetes.io/projected/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-kube-api-access-lqkh4\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.891277 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-dns-svc\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.891427 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-config\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.891463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.891481 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" 
(UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.892595 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-config\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.892830 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.893391 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-dns-svc\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.893792 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 crc kubenswrapper[4786]: I1002 06:59:21.904085 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkh4\" (UniqueName: \"kubernetes.io/projected/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-kube-api-access-lqkh4\") pod \"dnsmasq-dns-5ddc467bbc-h5rtm\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:21 
crc kubenswrapper[4786]: I1002 06:59:21.954318 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 06:59:22 crc kubenswrapper[4786]: I1002 06:59:22.254786 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7fbc4849-8ffsc"] Oct 02 06:59:22 crc kubenswrapper[4786]: W1002 06:59:22.262250 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3bf385b_a0e8_4687_9bf5_5a76b1ece6e8.slice/crio-27b35001435fe94deb2e5b5377c357a0f71e6fe677e1b8893e68b6c992c18f87 WatchSource:0}: Error finding container 27b35001435fe94deb2e5b5377c357a0f71e6fe677e1b8893e68b6c992c18f87: Status 404 returned error can't find the container with id 27b35001435fe94deb2e5b5377c357a0f71e6fe677e1b8893e68b6c992c18f87 Oct 02 06:59:22 crc kubenswrapper[4786]: I1002 06:59:22.306725 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" event={"ID":"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8","Type":"ContainerStarted","Data":"27b35001435fe94deb2e5b5377c357a0f71e6fe677e1b8893e68b6c992c18f87"} Oct 02 06:59:22 crc kubenswrapper[4786]: I1002 06:59:22.354229 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddc467bbc-h5rtm"] Oct 02 06:59:22 crc kubenswrapper[4786]: W1002 06:59:22.362647 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2766cdbc_f285_4d59_8fbf_bde633d4e4f8.slice/crio-408aa7bf520715ad04135903b779a8397160cf33482ecc40fb543f41f37f9c04 WatchSource:0}: Error finding container 408aa7bf520715ad04135903b779a8397160cf33482ecc40fb543f41f37f9c04: Status 404 returned error can't find the container with id 408aa7bf520715ad04135903b779a8397160cf33482ecc40fb543f41f37f9c04 Oct 02 06:59:23 crc kubenswrapper[4786]: I1002 06:59:23.315365 4786 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" event={"ID":"2766cdbc-f285-4d59-8fbf-bde633d4e4f8","Type":"ContainerStarted","Data":"408aa7bf520715ad04135903b779a8397160cf33482ecc40fb543f41f37f9c04"} Oct 02 06:59:25 crc kubenswrapper[4786]: I1002 06:59:25.923203 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p7lsm"] Oct 02 06:59:25 crc kubenswrapper[4786]: I1002 06:59:25.926611 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:25 crc kubenswrapper[4786]: I1002 06:59:25.949809 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7lsm"] Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.053033 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wsdn\" (UniqueName: \"kubernetes.io/projected/51562e5e-7976-4c17-9db8-844c4c20fa0e-kube-api-access-8wsdn\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.053246 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-catalog-content\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.053394 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-utilities\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc 
kubenswrapper[4786]: I1002 06:59:26.155721 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-utilities\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.155882 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wsdn\" (UniqueName: \"kubernetes.io/projected/51562e5e-7976-4c17-9db8-844c4c20fa0e-kube-api-access-8wsdn\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.155992 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-catalog-content\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.156211 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-utilities\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.156329 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-catalog-content\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.174787 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wsdn\" (UniqueName: \"kubernetes.io/projected/51562e5e-7976-4c17-9db8-844c4c20fa0e-kube-api-access-8wsdn\") pod \"redhat-operators-p7lsm\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.254817 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:26 crc kubenswrapper[4786]: I1002 06:59:26.665736 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7lsm"] Oct 02 06:59:26 crc kubenswrapper[4786]: W1002 06:59:26.672506 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51562e5e_7976_4c17_9db8_844c4c20fa0e.slice/crio-6d12db4621c1efa3c07c27485339bce7712676cd71a1f7b8cdce57e799378c7a WatchSource:0}: Error finding container 6d12db4621c1efa3c07c27485339bce7712676cd71a1f7b8cdce57e799378c7a: Status 404 returned error can't find the container with id 6d12db4621c1efa3c07c27485339bce7712676cd71a1f7b8cdce57e799378c7a Oct 02 06:59:27 crc kubenswrapper[4786]: I1002 06:59:27.342753 4786 generic.go:334] "Generic (PLEG): container finished" podID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerID="54c8c504d77be35860ad3382c03999760b03e01f310f33897e73ce2ba1932adb" exitCode=0 Oct 02 06:59:27 crc kubenswrapper[4786]: I1002 06:59:27.342837 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7lsm" event={"ID":"51562e5e-7976-4c17-9db8-844c4c20fa0e","Type":"ContainerDied","Data":"54c8c504d77be35860ad3382c03999760b03e01f310f33897e73ce2ba1932adb"} Oct 02 06:59:27 crc kubenswrapper[4786]: I1002 06:59:27.342983 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7lsm" 
event={"ID":"51562e5e-7976-4c17-9db8-844c4c20fa0e","Type":"ContainerStarted","Data":"6d12db4621c1efa3c07c27485339bce7712676cd71a1f7b8cdce57e799378c7a"} Oct 02 06:59:29 crc kubenswrapper[4786]: I1002 06:59:29.356463 4786 generic.go:334] "Generic (PLEG): container finished" podID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerID="c0c58353c09fb3743902b1796aa502f11f9268eeec11d8f322054be4b6a361f4" exitCode=0 Oct 02 06:59:29 crc kubenswrapper[4786]: I1002 06:59:29.356557 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7lsm" event={"ID":"51562e5e-7976-4c17-9db8-844c4c20fa0e","Type":"ContainerDied","Data":"c0c58353c09fb3743902b1796aa502f11f9268eeec11d8f322054be4b6a361f4"} Oct 02 06:59:30 crc kubenswrapper[4786]: I1002 06:59:30.376440 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7lsm" event={"ID":"51562e5e-7976-4c17-9db8-844c4c20fa0e","Type":"ContainerStarted","Data":"90102974102e260a1357c52f51d6f9188789286edffa4048ae82fc562107e009"} Oct 02 06:59:30 crc kubenswrapper[4786]: I1002 06:59:30.392251 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p7lsm" podStartSLOduration=2.8700052080000003 podStartE2EDuration="5.392238694s" podCreationTimestamp="2025-10-02 06:59:25 +0000 UTC" firstStartedPulling="2025-10-02 06:59:27.344323007 +0000 UTC m=+777.465506138" lastFinishedPulling="2025-10-02 06:59:29.866556493 +0000 UTC m=+779.987739624" observedRunningTime="2025-10-02 06:59:30.388581416 +0000 UTC m=+780.509764567" watchObservedRunningTime="2025-10-02 06:59:30.392238694 +0000 UTC m=+780.513421825" Oct 02 06:59:36 crc kubenswrapper[4786]: I1002 06:59:36.255631 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:36 crc kubenswrapper[4786]: I1002 06:59:36.256006 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:36 crc kubenswrapper[4786]: I1002 06:59:36.287675 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:36 crc kubenswrapper[4786]: I1002 06:59:36.451870 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:36 crc kubenswrapper[4786]: I1002 06:59:36.511363 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7lsm"] Oct 02 06:59:38 crc kubenswrapper[4786]: I1002 06:59:38.432707 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p7lsm" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerName="registry-server" containerID="cri-o://90102974102e260a1357c52f51d6f9188789286edffa4048ae82fc562107e009" gracePeriod=2 Oct 02 06:59:39 crc kubenswrapper[4786]: I1002 06:59:39.441212 4786 generic.go:334] "Generic (PLEG): container finished" podID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerID="90102974102e260a1357c52f51d6f9188789286edffa4048ae82fc562107e009" exitCode=0 Oct 02 06:59:39 crc kubenswrapper[4786]: I1002 06:59:39.441250 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7lsm" event={"ID":"51562e5e-7976-4c17-9db8-844c4c20fa0e","Type":"ContainerDied","Data":"90102974102e260a1357c52f51d6f9188789286edffa4048ae82fc562107e009"} Oct 02 06:59:43 crc kubenswrapper[4786]: I1002 06:59:43.855476 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:43 crc kubenswrapper[4786]: I1002 06:59:43.942101 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-utilities\") pod \"51562e5e-7976-4c17-9db8-844c4c20fa0e\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " Oct 02 06:59:43 crc kubenswrapper[4786]: I1002 06:59:43.942630 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-catalog-content\") pod \"51562e5e-7976-4c17-9db8-844c4c20fa0e\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " Oct 02 06:59:43 crc kubenswrapper[4786]: I1002 06:59:43.942723 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wsdn\" (UniqueName: \"kubernetes.io/projected/51562e5e-7976-4c17-9db8-844c4c20fa0e-kube-api-access-8wsdn\") pod \"51562e5e-7976-4c17-9db8-844c4c20fa0e\" (UID: \"51562e5e-7976-4c17-9db8-844c4c20fa0e\") " Oct 02 06:59:43 crc kubenswrapper[4786]: I1002 06:59:43.943357 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-utilities" (OuterVolumeSpecName: "utilities") pod "51562e5e-7976-4c17-9db8-844c4c20fa0e" (UID: "51562e5e-7976-4c17-9db8-844c4c20fa0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:59:43 crc kubenswrapper[4786]: I1002 06:59:43.946704 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51562e5e-7976-4c17-9db8-844c4c20fa0e-kube-api-access-8wsdn" (OuterVolumeSpecName: "kube-api-access-8wsdn") pod "51562e5e-7976-4c17-9db8-844c4c20fa0e" (UID: "51562e5e-7976-4c17-9db8-844c4c20fa0e"). InnerVolumeSpecName "kube-api-access-8wsdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.001552 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51562e5e-7976-4c17-9db8-844c4c20fa0e" (UID: "51562e5e-7976-4c17-9db8-844c4c20fa0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.044781 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.044808 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51562e5e-7976-4c17-9db8-844c4c20fa0e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.044818 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wsdn\" (UniqueName: \"kubernetes.io/projected/51562e5e-7976-4c17-9db8-844c4c20fa0e-kube-api-access-8wsdn\") on node \"crc\" DevicePath \"\"" Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.467858 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7lsm" event={"ID":"51562e5e-7976-4c17-9db8-844c4c20fa0e","Type":"ContainerDied","Data":"6d12db4621c1efa3c07c27485339bce7712676cd71a1f7b8cdce57e799378c7a"} Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.467906 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7lsm" Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.468017 4786 scope.go:117] "RemoveContainer" containerID="90102974102e260a1357c52f51d6f9188789286edffa4048ae82fc562107e009" Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.483041 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7lsm"] Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.488033 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p7lsm"] Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.493897 4786 scope.go:117] "RemoveContainer" containerID="c0c58353c09fb3743902b1796aa502f11f9268eeec11d8f322054be4b6a361f4" Oct 02 06:59:44 crc kubenswrapper[4786]: I1002 06:59:44.516643 4786 scope.go:117] "RemoveContainer" containerID="54c8c504d77be35860ad3382c03999760b03e01f310f33897e73ce2ba1932adb" Oct 02 06:59:46 crc kubenswrapper[4786]: I1002 06:59:46.186545 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" path="/var/lib/kubelet/pods/51562e5e-7976-4c17-9db8-844c4c20fa0e/volumes" Oct 02 06:59:56 crc kubenswrapper[4786]: I1002 06:59:56.555563 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7a34253b-beed-468f-8bad-82366a5eb5c3","Type":"ContainerStarted","Data":"38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190"} Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.151371 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg"] Oct 02 07:00:00 crc kubenswrapper[4786]: E1002 07:00:00.152344 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerName="registry-server" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.152360 4786 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerName="registry-server" Oct 02 07:00:00 crc kubenswrapper[4786]: E1002 07:00:00.152392 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerName="extract-utilities" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.152399 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerName="extract-utilities" Oct 02 07:00:00 crc kubenswrapper[4786]: E1002 07:00:00.152450 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerName="extract-content" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.152457 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerName="extract-content" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.153835 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="51562e5e-7976-4c17-9db8-844c4c20fa0e" containerName="registry-server" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.155346 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg"] Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.155452 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.158107 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.158312 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.282579 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b28mg\" (UniqueName: \"kubernetes.io/projected/6f417dee-ea81-42d6-8941-e93520864f2b-kube-api-access-b28mg\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.282685 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f417dee-ea81-42d6-8941-e93520864f2b-secret-volume\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.282925 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f417dee-ea81-42d6-8941-e93520864f2b-config-volume\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.384900 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b28mg\" (UniqueName: \"kubernetes.io/projected/6f417dee-ea81-42d6-8941-e93520864f2b-kube-api-access-b28mg\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.384989 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f417dee-ea81-42d6-8941-e93520864f2b-secret-volume\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.385014 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f417dee-ea81-42d6-8941-e93520864f2b-config-volume\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.385792 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f417dee-ea81-42d6-8941-e93520864f2b-config-volume\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.401996 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f417dee-ea81-42d6-8941-e93520864f2b-secret-volume\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 
07:00:00.404057 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b28mg\" (UniqueName: \"kubernetes.io/projected/6f417dee-ea81-42d6-8941-e93520864f2b-kube-api-access-b28mg\") pod \"collect-profiles-29323140-q98jg\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:00 crc kubenswrapper[4786]: I1002 07:00:00.471232 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:04 crc kubenswrapper[4786]: I1002 07:00:04.604079 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"26e71aaf-fc49-4218-b001-433de642f9ae","Type":"ContainerStarted","Data":"99492e0fc002759e9a91e3e8484594b042c7b172d8e01feddcb9817f77635e1e"} Oct 02 07:00:04 crc kubenswrapper[4786]: I1002 07:00:04.604507 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 02 07:00:04 crc kubenswrapper[4786]: I1002 07:00:04.605357 4786 generic.go:334] "Generic (PLEG): container finished" podID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerID="81b69da45d0750565259610267d64d14e4815cf53389fe2eed3f87361c09e081" exitCode=0 Oct 02 07:00:04 crc kubenswrapper[4786]: I1002 07:00:04.605378 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" event={"ID":"2766cdbc-f285-4d59-8fbf-bde633d4e4f8","Type":"ContainerDied","Data":"81b69da45d0750565259610267d64d14e4815cf53389fe2eed3f87361c09e081"} Oct 02 07:00:04 crc kubenswrapper[4786]: I1002 07:00:04.620876 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.453216063 podStartE2EDuration="55.620865999s" podCreationTimestamp="2025-10-02 06:59:09 +0000 UTC" firstStartedPulling="2025-10-02 06:59:10.194363588 +0000 UTC m=+760.315546720" 
lastFinishedPulling="2025-10-02 07:00:04.362013524 +0000 UTC m=+814.483196656" observedRunningTime="2025-10-02 07:00:04.617023729 +0000 UTC m=+814.738206870" watchObservedRunningTime="2025-10-02 07:00:04.620865999 +0000 UTC m=+814.742049130" Oct 02 07:00:04 crc kubenswrapper[4786]: I1002 07:00:04.739560 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg"] Oct 02 07:00:04 crc kubenswrapper[4786]: W1002 07:00:04.776979 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f417dee_ea81_42d6_8941_e93520864f2b.slice/crio-5054dc805bb715c1a5b580eff6440093273941badf99c5dcdfbade4ce9bec89f WatchSource:0}: Error finding container 5054dc805bb715c1a5b580eff6440093273941badf99c5dcdfbade4ce9bec89f: Status 404 returned error can't find the container with id 5054dc805bb715c1a5b580eff6440093273941badf99c5dcdfbade4ce9bec89f Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.611953 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" event={"ID":"2766cdbc-f285-4d59-8fbf-bde633d4e4f8","Type":"ContainerStarted","Data":"fafcba6321f43f09f3c3c130a192f7beabcf8530ac10c02395ed6ec7ee97d6f3"} Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.612904 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.614188 4786 generic.go:334] "Generic (PLEG): container finished" podID="388590eb-6c04-4f86-9676-11054d02eae7" containerID="99311f69b078776a4ca4ef245ee2d03bedb9b18bb593c2ab1999ca5e911b5b4e" exitCode=0 Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.614253 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" 
event={"ID":"388590eb-6c04-4f86-9676-11054d02eae7","Type":"ContainerDied","Data":"99311f69b078776a4ca4ef245ee2d03bedb9b18bb593c2ab1999ca5e911b5b4e"} Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.616001 4786 generic.go:334] "Generic (PLEG): container finished" podID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerID="febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7" exitCode=0 Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.616104 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" event={"ID":"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8","Type":"ContainerDied","Data":"febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7"} Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.619514 4786 generic.go:334] "Generic (PLEG): container finished" podID="f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce" containerID="1f4ea204194d68cee3aab6ddfe27c62311fb6b592e752acfc87d1ab0d261953e" exitCode=0 Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.619565 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" event={"ID":"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce","Type":"ContainerDied","Data":"1f4ea204194d68cee3aab6ddfe27c62311fb6b592e752acfc87d1ab0d261953e"} Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.621766 4786 generic.go:334] "Generic (PLEG): container finished" podID="6f417dee-ea81-42d6-8941-e93520864f2b" containerID="4bf46be3c7541796c23c1532f0077a0a14d1d6b71082102b2e0d78709f05ec1f" exitCode=0 Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.621820 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" event={"ID":"6f417dee-ea81-42d6-8941-e93520864f2b","Type":"ContainerDied","Data":"4bf46be3c7541796c23c1532f0077a0a14d1d6b71082102b2e0d78709f05ec1f"} Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.621837 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" event={"ID":"6f417dee-ea81-42d6-8941-e93520864f2b","Type":"ContainerStarted","Data":"5054dc805bb715c1a5b580eff6440093273941badf99c5dcdfbade4ce9bec89f"} Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.623261 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gl77n" event={"ID":"d0b71af7-f7f4-45e8-b8f0-c7428f54a37d","Type":"ContainerStarted","Data":"7e939886ed0381e14df5ae656e72ef38f232668d86104905c33925921a872d9d"} Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.623364 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-gl77n" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.631705 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" podStartSLOduration=2.601065572 podStartE2EDuration="44.631679527s" podCreationTimestamp="2025-10-02 06:59:21 +0000 UTC" firstStartedPulling="2025-10-02 06:59:22.365264522 +0000 UTC m=+772.486447653" lastFinishedPulling="2025-10-02 07:00:04.395878476 +0000 UTC m=+814.517061608" observedRunningTime="2025-10-02 07:00:05.626311714 +0000 UTC m=+815.747494846" watchObservedRunningTime="2025-10-02 07:00:05.631679527 +0000 UTC m=+815.752862658" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.693278 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gl77n" podStartSLOduration=2.112605193 podStartE2EDuration="49.693264033s" podCreationTimestamp="2025-10-02 06:59:16 +0000 UTC" firstStartedPulling="2025-10-02 06:59:16.862355401 +0000 UTC m=+766.983538533" lastFinishedPulling="2025-10-02 07:00:04.443014242 +0000 UTC m=+814.564197373" observedRunningTime="2025-10-02 07:00:05.690916721 +0000 UTC m=+815.812099862" watchObservedRunningTime="2025-10-02 07:00:05.693264033 +0000 UTC m=+815.814447165" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 
07:00:05.830600 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.872005 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-dns-svc\") pod \"388590eb-6c04-4f86-9676-11054d02eae7\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.872139 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-config\") pod \"388590eb-6c04-4f86-9676-11054d02eae7\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.872192 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84zsg\" (UniqueName: \"kubernetes.io/projected/388590eb-6c04-4f86-9676-11054d02eae7-kube-api-access-84zsg\") pod \"388590eb-6c04-4f86-9676-11054d02eae7\" (UID: \"388590eb-6c04-4f86-9676-11054d02eae7\") " Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.875537 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388590eb-6c04-4f86-9676-11054d02eae7-kube-api-access-84zsg" (OuterVolumeSpecName: "kube-api-access-84zsg") pod "388590eb-6c04-4f86-9676-11054d02eae7" (UID: "388590eb-6c04-4f86-9676-11054d02eae7"). InnerVolumeSpecName "kube-api-access-84zsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.888959 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-config" (OuterVolumeSpecName: "config") pod "388590eb-6c04-4f86-9676-11054d02eae7" (UID: "388590eb-6c04-4f86-9676-11054d02eae7"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.888971 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "388590eb-6c04-4f86-9676-11054d02eae7" (UID: "388590eb-6c04-4f86-9676-11054d02eae7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.891753 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.973526 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-config\") pod \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.973670 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74rtl\" (UniqueName: \"kubernetes.io/projected/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-kube-api-access-74rtl\") pod \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.973723 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-dns-svc\") pod \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\" (UID: \"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce\") " Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.974276 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-config\") on node \"crc\" DevicePath \"\"" Oct 
02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.974293 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84zsg\" (UniqueName: \"kubernetes.io/projected/388590eb-6c04-4f86-9676-11054d02eae7-kube-api-access-84zsg\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.974304 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/388590eb-6c04-4f86-9676-11054d02eae7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.975988 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-kube-api-access-74rtl" (OuterVolumeSpecName: "kube-api-access-74rtl") pod "f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce" (UID: "f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce"). InnerVolumeSpecName "kube-api-access-74rtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.986999 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-config" (OuterVolumeSpecName: "config") pod "f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce" (UID: "f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:05 crc kubenswrapper[4786]: I1002 07:00:05.988993 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce" (UID: "f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.075829 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.076057 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74rtl\" (UniqueName: \"kubernetes.io/projected/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-kube-api-access-74rtl\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.076070 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.659303 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6","Type":"ContainerStarted","Data":"c40c38a5e1d499701698adc2a3ed2ab5d4cbaf028b9e5f7978f072f231f41d39"} Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.703861 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" event={"ID":"388590eb-6c04-4f86-9676-11054d02eae7","Type":"ContainerDied","Data":"4ca4fe9f4118797ad1af909c2b876c240b3302826a8345be7f2a5501268ed38e"} Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.704081 4786 scope.go:117] "RemoveContainer" containerID="99311f69b078776a4ca4ef245ee2d03bedb9b18bb593c2ab1999ca5e911b5b4e" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.704182 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5465cc9897-6w9ft" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.711731 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" event={"ID":"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8","Type":"ContainerStarted","Data":"ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28"} Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.711915 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.715307 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" event={"ID":"f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce","Type":"ContainerDied","Data":"212dbbeea78ffe08985183a8740c74fe67f10721663f8e24a08da4b7b89e3d9a"} Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.715312 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58cfb8dc65-t9s4x" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.740497 4786 scope.go:117] "RemoveContainer" containerID="1f4ea204194d68cee3aab6ddfe27c62311fb6b592e752acfc87d1ab0d261953e" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.741846 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5465cc9897-6w9ft"] Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.756018 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5465cc9897-6w9ft"] Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.774308 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58cfb8dc65-t9s4x"] Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.799602 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58cfb8dc65-t9s4x"] Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.804316 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" podStartSLOduration=3.6167229499999998 podStartE2EDuration="45.804298731s" podCreationTimestamp="2025-10-02 06:59:21 +0000 UTC" firstStartedPulling="2025-10-02 06:59:22.264133863 +0000 UTC m=+772.385316993" lastFinishedPulling="2025-10-02 07:00:04.451709644 +0000 UTC m=+814.572892774" observedRunningTime="2025-10-02 07:00:06.794796689 +0000 UTC m=+816.915979840" watchObservedRunningTime="2025-10-02 07:00:06.804298731 +0000 UTC m=+816.925481852" Oct 02 07:00:06 crc kubenswrapper[4786]: I1002 07:00:06.978283 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.093308 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b28mg\" (UniqueName: \"kubernetes.io/projected/6f417dee-ea81-42d6-8941-e93520864f2b-kube-api-access-b28mg\") pod \"6f417dee-ea81-42d6-8941-e93520864f2b\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.093427 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f417dee-ea81-42d6-8941-e93520864f2b-secret-volume\") pod \"6f417dee-ea81-42d6-8941-e93520864f2b\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.093538 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f417dee-ea81-42d6-8941-e93520864f2b-config-volume\") pod \"6f417dee-ea81-42d6-8941-e93520864f2b\" (UID: \"6f417dee-ea81-42d6-8941-e93520864f2b\") " Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.094060 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6f417dee-ea81-42d6-8941-e93520864f2b-config-volume" (OuterVolumeSpecName: "config-volume") pod "6f417dee-ea81-42d6-8941-e93520864f2b" (UID: "6f417dee-ea81-42d6-8941-e93520864f2b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.096906 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f417dee-ea81-42d6-8941-e93520864f2b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6f417dee-ea81-42d6-8941-e93520864f2b" (UID: "6f417dee-ea81-42d6-8941-e93520864f2b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.096962 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f417dee-ea81-42d6-8941-e93520864f2b-kube-api-access-b28mg" (OuterVolumeSpecName: "kube-api-access-b28mg") pod "6f417dee-ea81-42d6-8941-e93520864f2b" (UID: "6f417dee-ea81-42d6-8941-e93520864f2b"). InnerVolumeSpecName "kube-api-access-b28mg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.195515 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f417dee-ea81-42d6-8941-e93520864f2b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.195541 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b28mg\" (UniqueName: \"kubernetes.io/projected/6f417dee-ea81-42d6-8941-e93520864f2b-kube-api-access-b28mg\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.195571 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6f417dee-ea81-42d6-8941-e93520864f2b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.722916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" event={"ID":"6f417dee-ea81-42d6-8941-e93520864f2b","Type":"ContainerDied","Data":"5054dc805bb715c1a5b580eff6440093273941badf99c5dcdfbade4ce9bec89f"} Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.722948 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5054dc805bb715c1a5b580eff6440093273941badf99c5dcdfbade4ce9bec89f" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.722991 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323140-q98jg" Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.725736 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9edb2cb3-f47c-4f56-8181-1ec8f9d774f6","Type":"ContainerStarted","Data":"a144dd7fa92f2b0b1a1e00b9dedc9ce484ca8c0e986ed3f4f69789c99803505b"} Oct 02 07:00:07 crc kubenswrapper[4786]: I1002 07:00:07.745183 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.117109499 podStartE2EDuration="50.745169636s" podCreationTimestamp="2025-10-02 06:59:17 +0000 UTC" firstStartedPulling="2025-10-02 06:59:20.842207652 +0000 UTC m=+770.963390783" lastFinishedPulling="2025-10-02 07:00:06.470267799 +0000 UTC m=+816.591450920" observedRunningTime="2025-10-02 07:00:07.742001557 +0000 UTC m=+817.863184698" watchObservedRunningTime="2025-10-02 07:00:07.745169636 +0000 UTC m=+817.866352767" Oct 02 07:00:08 crc kubenswrapper[4786]: I1002 07:00:08.186272 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388590eb-6c04-4f86-9676-11054d02eae7" path="/var/lib/kubelet/pods/388590eb-6c04-4f86-9676-11054d02eae7/volumes" Oct 02 07:00:08 crc kubenswrapper[4786]: I1002 07:00:08.187064 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce" path="/var/lib/kubelet/pods/f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce/volumes" Oct 02 07:00:08 crc kubenswrapper[4786]: I1002 07:00:08.733093 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" event={"ID":"4e99c2f0-d917-400c-9fb9-48abce2f4854","Type":"ContainerStarted","Data":"d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355"} Oct 02 07:00:08 crc kubenswrapper[4786]: I1002 07:00:08.958499 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.119714 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e99c2f0-d917-400c-9fb9-48abce2f4854-config\") pod \"4e99c2f0-d917-400c-9fb9-48abce2f4854\" (UID: \"4e99c2f0-d917-400c-9fb9-48abce2f4854\") " Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.119758 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrk9f\" (UniqueName: \"kubernetes.io/projected/4e99c2f0-d917-400c-9fb9-48abce2f4854-kube-api-access-mrk9f\") pod \"4e99c2f0-d917-400c-9fb9-48abce2f4854\" (UID: \"4e99c2f0-d917-400c-9fb9-48abce2f4854\") " Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.122791 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e99c2f0-d917-400c-9fb9-48abce2f4854-kube-api-access-mrk9f" (OuterVolumeSpecName: "kube-api-access-mrk9f") pod "4e99c2f0-d917-400c-9fb9-48abce2f4854" (UID: "4e99c2f0-d917-400c-9fb9-48abce2f4854"). InnerVolumeSpecName "kube-api-access-mrk9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.132529 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e99c2f0-d917-400c-9fb9-48abce2f4854-config" (OuterVolumeSpecName: "config") pod "4e99c2f0-d917-400c-9fb9-48abce2f4854" (UID: "4e99c2f0-d917-400c-9fb9-48abce2f4854"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.221317 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e99c2f0-d917-400c-9fb9-48abce2f4854-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.221372 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrk9f\" (UniqueName: \"kubernetes.io/projected/4e99c2f0-d917-400c-9fb9-48abce2f4854-kube-api-access-mrk9f\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.462627 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.740122 4786 generic.go:334] "Generic (PLEG): container finished" podID="4e99c2f0-d917-400c-9fb9-48abce2f4854" containerID="d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355" exitCode=0 Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.740158 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.740165 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" event={"ID":"4e99c2f0-d917-400c-9fb9-48abce2f4854","Type":"ContainerDied","Data":"d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355"} Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.740460 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85dddbfdbf-85pgc" event={"ID":"4e99c2f0-d917-400c-9fb9-48abce2f4854","Type":"ContainerDied","Data":"6239854b1746bf971777b1c7b66626fceb540abaa5551e541415212cb06b4279"} Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.740507 4786 scope.go:117] "RemoveContainer" containerID="d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.754922 4786 scope.go:117] "RemoveContainer" containerID="d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355" Oct 02 07:00:09 crc kubenswrapper[4786]: E1002 07:00:09.755236 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355\": container with ID starting with d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355 not found: ID does not exist" containerID="d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.755271 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355"} err="failed to get container status \"d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355\": rpc error: code = NotFound desc = could not find container \"d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355\": container with 
ID starting with d64a4d88002751eb92da7a49f380fad3845a82e5a14fe0633bf9e94cc9bfd355 not found: ID does not exist" Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.792901 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85dddbfdbf-85pgc"] Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.798430 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85dddbfdbf-85pgc"] Oct 02 07:00:09 crc kubenswrapper[4786]: I1002 07:00:09.800426 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 02 07:00:10 crc kubenswrapper[4786]: I1002 07:00:10.186476 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e99c2f0-d917-400c-9fb9-48abce2f4854" path="/var/lib/kubelet/pods/4e99c2f0-d917-400c-9fb9-48abce2f4854/volumes" Oct 02 07:00:10 crc kubenswrapper[4786]: I1002 07:00:10.461905 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 02 07:00:10 crc kubenswrapper[4786]: I1002 07:00:10.496233 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.297591 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddc467bbc-h5rtm"] Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.298047 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" podUID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerName="dnsmasq-dns" containerID="cri-o://fafcba6321f43f09f3c3c130a192f7beabcf8530ac10c02395ed6ec7ee97d6f3" gracePeriod=10 Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.298854 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.341454 4786 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-78586fdbff-kkx7p"] Oct 02 07:00:11 crc kubenswrapper[4786]: E1002 07:00:11.341766 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f417dee-ea81-42d6-8941-e93520864f2b" containerName="collect-profiles" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.341783 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f417dee-ea81-42d6-8941-e93520864f2b" containerName="collect-profiles" Oct 02 07:00:11 crc kubenswrapper[4786]: E1002 07:00:11.341795 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e99c2f0-d917-400c-9fb9-48abce2f4854" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.341801 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e99c2f0-d917-400c-9fb9-48abce2f4854" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: E1002 07:00:11.341811 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388590eb-6c04-4f86-9676-11054d02eae7" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.341816 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="388590eb-6c04-4f86-9676-11054d02eae7" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: E1002 07:00:11.341833 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.341837 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.342005 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e99c2f0-d917-400c-9fb9-48abce2f4854" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.342020 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f417dee-ea81-42d6-8941-e93520864f2b" containerName="collect-profiles" Oct 02 
07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.342028 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="388590eb-6c04-4f86-9676-11054d02eae7" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.342036 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4adf5b6-7f9d-4ef1-a946-153aacf9b0ce" containerName="init" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.342784 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.350235 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78586fdbff-kkx7p"] Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.458152 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7tjz\" (UniqueName: \"kubernetes.io/projected/b03cd4b9-a66a-4a03-9915-88fe79682756-kube-api-access-l7tjz\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.458210 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-config\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.458249 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-sb\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 
07:00:11.458281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-nb\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.458336 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-dns-svc\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.562164 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-config\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.562248 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-sb\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.562298 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-nb\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.562343 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-dns-svc\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.562415 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7tjz\" (UniqueName: \"kubernetes.io/projected/b03cd4b9-a66a-4a03-9915-88fe79682756-kube-api-access-l7tjz\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.563923 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-config\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.564018 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-nb\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.564070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-sb\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.564466 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-dns-svc\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.576501 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7tjz\" (UniqueName: \"kubernetes.io/projected/b03cd4b9-a66a-4a03-9915-88fe79682756-kube-api-access-l7tjz\") pod \"dnsmasq-dns-78586fdbff-kkx7p\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.738976 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.757445 4786 generic.go:334] "Generic (PLEG): container finished" podID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerID="fafcba6321f43f09f3c3c130a192f7beabcf8530ac10c02395ed6ec7ee97d6f3" exitCode=0 Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.757537 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" event={"ID":"2766cdbc-f285-4d59-8fbf-bde633d4e4f8","Type":"ContainerDied","Data":"fafcba6321f43f09f3c3c130a192f7beabcf8530ac10c02395ed6ec7ee97d6f3"} Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.789537 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.858938 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 07:00:11 crc kubenswrapper[4786]: I1002 07:00:11.954776 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" podUID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.108:5353: connect: connection refused" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.434916 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.439335 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.441910 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.441936 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.442167 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.442835 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-8vp27" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.443935 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.579538 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.579583 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.579639 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8f6s\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-kube-api-access-w8f6s\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.579665 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-lock\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.580062 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-cache\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.682028 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8f6s\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-kube-api-access-w8f6s\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.682073 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-lock\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.682124 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-cache\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.682188 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.682205 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: E1002 07:00:12.682362 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 07:00:12 crc kubenswrapper[4786]: E1002 07:00:12.682381 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 07:00:12 crc kubenswrapper[4786]: E1002 07:00:12.682435 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift podName:15dfcc2f-b58a-4238-a555-e0e5e4a05e4b nodeName:}" failed. No retries permitted until 2025-10-02 07:00:13.182421394 +0000 UTC m=+823.303604524 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift") pod "swift-storage-0" (UID: "15dfcc2f-b58a-4238-a555-e0e5e4a05e4b") : configmap "swift-ring-files" not found Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.682438 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.682574 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-lock\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.682610 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-cache\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.696203 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8f6s\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-kube-api-access-w8f6s\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:12 crc kubenswrapper[4786]: I1002 07:00:12.697496 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " 
pod="openstack/swift-storage-0" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.169682 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.188166 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:13 crc kubenswrapper[4786]: E1002 07:00:13.188336 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 07:00:13 crc kubenswrapper[4786]: E1002 07:00:13.188354 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 07:00:13 crc kubenswrapper[4786]: E1002 07:00:13.188404 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift podName:15dfcc2f-b58a-4238-a555-e0e5e4a05e4b nodeName:}" failed. No retries permitted until 2025-10-02 07:00:14.188378189 +0000 UTC m=+824.309561319 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift") pod "swift-storage-0" (UID: "15dfcc2f-b58a-4238-a555-e0e5e4a05e4b") : configmap "swift-ring-files" not found Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.289513 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-config\") pod \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.289572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqkh4\" (UniqueName: \"kubernetes.io/projected/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-kube-api-access-lqkh4\") pod \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.289618 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-sb\") pod \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.289637 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-dns-svc\") pod \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\" (UID: \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.289748 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-nb\") pod \"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\" (UID: 
\"2766cdbc-f285-4d59-8fbf-bde633d4e4f8\") " Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.293657 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-kube-api-access-lqkh4" (OuterVolumeSpecName: "kube-api-access-lqkh4") pod "2766cdbc-f285-4d59-8fbf-bde633d4e4f8" (UID: "2766cdbc-f285-4d59-8fbf-bde633d4e4f8"). InnerVolumeSpecName "kube-api-access-lqkh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.316316 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2766cdbc-f285-4d59-8fbf-bde633d4e4f8" (UID: "2766cdbc-f285-4d59-8fbf-bde633d4e4f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.316365 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2766cdbc-f285-4d59-8fbf-bde633d4e4f8" (UID: "2766cdbc-f285-4d59-8fbf-bde633d4e4f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.317189 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-config" (OuterVolumeSpecName: "config") pod "2766cdbc-f285-4d59-8fbf-bde633d4e4f8" (UID: "2766cdbc-f285-4d59-8fbf-bde633d4e4f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.319791 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2766cdbc-f285-4d59-8fbf-bde633d4e4f8" (UID: "2766cdbc-f285-4d59-8fbf-bde633d4e4f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.348641 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78586fdbff-kkx7p"] Oct 02 07:00:13 crc kubenswrapper[4786]: W1002 07:00:13.354635 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb03cd4b9_a66a_4a03_9915_88fe79682756.slice/crio-5aa4432afde477dd048a128391e2ce4001d982a43920192413f5e56d5d70a491 WatchSource:0}: Error finding container 5aa4432afde477dd048a128391e2ce4001d982a43920192413f5e56d5d70a491: Status 404 returned error can't find the container with id 5aa4432afde477dd048a128391e2ce4001d982a43920192413f5e56d5d70a491 Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.391263 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.391287 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.391296 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqkh4\" (UniqueName: \"kubernetes.io/projected/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-kube-api-access-lqkh4\") on node \"crc\" DevicePath 
\"\"" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.391304 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.391312 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2766cdbc-f285-4d59-8fbf-bde633d4e4f8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.776521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4c62a8c-eb69-4f97-a299-a44f87315f81","Type":"ContainerStarted","Data":"85b8695e9ffdd577f4094e6890a46b1d81c99866bb4420daeb73e5fb8d979bf8"} Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.778858 4786 generic.go:334] "Generic (PLEG): container finished" podID="be5227a5-2fd9-4e47-ba91-e96d06ad9700" containerID="54514540eecb110e85b35ac5fb262524b7560fe577c8d952d213e3e578212b9d" exitCode=0 Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.778907 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" event={"ID":"be5227a5-2fd9-4e47-ba91-e96d06ad9700","Type":"ContainerDied","Data":"54514540eecb110e85b35ac5fb262524b7560fe577c8d952d213e3e578212b9d"} Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.780274 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"106dd908-3398-489e-a39f-b684b5eecd2b","Type":"ContainerStarted","Data":"496f1a06e160bf4b7aab93d9d5e67a3d5b6b60d41956681219ef36631e1f462f"} Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.782632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" 
event={"ID":"2766cdbc-f285-4d59-8fbf-bde633d4e4f8","Type":"ContainerDied","Data":"408aa7bf520715ad04135903b779a8397160cf33482ecc40fb543f41f37f9c04"} Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.782643 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddc467bbc-h5rtm" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.782678 4786 scope.go:117] "RemoveContainer" containerID="fafcba6321f43f09f3c3c130a192f7beabcf8530ac10c02395ed6ec7ee97d6f3" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.783918 4786 generic.go:334] "Generic (PLEG): container finished" podID="b03cd4b9-a66a-4a03-9915-88fe79682756" containerID="bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd" exitCode=0 Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.783971 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" event={"ID":"b03cd4b9-a66a-4a03-9915-88fe79682756","Type":"ContainerDied","Data":"bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd"} Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.783988 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" event={"ID":"b03cd4b9-a66a-4a03-9915-88fe79682756","Type":"ContainerStarted","Data":"5aa4432afde477dd048a128391e2ce4001d982a43920192413f5e56d5d70a491"} Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.807516 4786 scope.go:117] "RemoveContainer" containerID="81b69da45d0750565259610267d64d14e4815cf53389fe2eed3f87361c09e081" Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.910631 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddc467bbc-h5rtm"] Oct 02 07:00:13 crc kubenswrapper[4786]: I1002 07:00:13.918054 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddc467bbc-h5rtm"] Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.024971 4786 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.107230 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-dns-svc\") pod \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.107286 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5brx\" (UniqueName: \"kubernetes.io/projected/be5227a5-2fd9-4e47-ba91-e96d06ad9700-kube-api-access-c5brx\") pod \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.107337 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-config\") pod \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\" (UID: \"be5227a5-2fd9-4e47-ba91-e96d06ad9700\") " Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.110453 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5227a5-2fd9-4e47-ba91-e96d06ad9700-kube-api-access-c5brx" (OuterVolumeSpecName: "kube-api-access-c5brx") pod "be5227a5-2fd9-4e47-ba91-e96d06ad9700" (UID: "be5227a5-2fd9-4e47-ba91-e96d06ad9700"). InnerVolumeSpecName "kube-api-access-c5brx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.120830 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be5227a5-2fd9-4e47-ba91-e96d06ad9700" (UID: "be5227a5-2fd9-4e47-ba91-e96d06ad9700"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.121258 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-config" (OuterVolumeSpecName: "config") pod "be5227a5-2fd9-4e47-ba91-e96d06ad9700" (UID: "be5227a5-2fd9-4e47-ba91-e96d06ad9700"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.187171 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" path="/var/lib/kubelet/pods/2766cdbc-f285-4d59-8fbf-bde633d4e4f8/volumes" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.209335 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.209411 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.209423 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5brx\" (UniqueName: \"kubernetes.io/projected/be5227a5-2fd9-4e47-ba91-e96d06ad9700-kube-api-access-c5brx\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.209434 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5227a5-2fd9-4e47-ba91-e96d06ad9700-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:14 crc kubenswrapper[4786]: E1002 07:00:14.209556 4786 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 07:00:14 crc kubenswrapper[4786]: E1002 07:00:14.209588 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 07:00:14 crc kubenswrapper[4786]: E1002 07:00:14.209642 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift podName:15dfcc2f-b58a-4238-a555-e0e5e4a05e4b nodeName:}" failed. No retries permitted until 2025-10-02 07:00:16.209627448 +0000 UTC m=+826.330810578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift") pod "swift-storage-0" (UID: "15dfcc2f-b58a-4238-a555-e0e5e4a05e4b") : configmap "swift-ring-files" not found Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.795587 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9","Type":"ContainerStarted","Data":"f550d45b9ed4a3b67aaacbd5f0860441d749d23993664a8f030d1f10157ce0a9"} Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.797201 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.797197 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c98cb8667-nk2d7" event={"ID":"be5227a5-2fd9-4e47-ba91-e96d06ad9700","Type":"ContainerDied","Data":"5234241b94f3231218364508c16eccf7f318ea531f4b503967c105850de4fbd0"} Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.797373 4786 scope.go:117] "RemoveContainer" containerID="54514540eecb110e85b35ac5fb262524b7560fe577c8d952d213e3e578212b9d" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.802450 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" event={"ID":"b03cd4b9-a66a-4a03-9915-88fe79682756","Type":"ContainerStarted","Data":"978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd"} Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.802561 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.803358 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94","Type":"ContainerStarted","Data":"aeadaf247e974c8121b9c913260efe601073c3a9a7d2a40863d88fcb35011d88"} Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.854459 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c98cb8667-nk2d7"] Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.859852 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c98cb8667-nk2d7"] Oct 02 07:00:14 crc kubenswrapper[4786]: I1002 07:00:14.862641 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" podStartSLOduration=3.862626394 podStartE2EDuration="3.862626394s" podCreationTimestamp="2025-10-02 07:00:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:00:14.856478922 +0000 UTC m=+824.977662063" watchObservedRunningTime="2025-10-02 07:00:14.862626394 +0000 UTC m=+824.983809525" Oct 02 07:00:15 crc kubenswrapper[4786]: I1002 07:00:15.810436 4786 generic.go:334] "Generic (PLEG): container finished" podID="452e4743-083e-420b-9dfc-ea81e1376373" containerID="08abb83604cc040a09d3ef916ec4afdb5b35d5bd4a861e98c4d62890d7c8eb03" exitCode=0 Oct 02 07:00:15 crc kubenswrapper[4786]: I1002 07:00:15.810478 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lf7tj" event={"ID":"452e4743-083e-420b-9dfc-ea81e1376373","Type":"ContainerDied","Data":"08abb83604cc040a09d3ef916ec4afdb5b35d5bd4a861e98c4d62890d7c8eb03"} Oct 02 07:00:15 crc kubenswrapper[4786]: I1002 07:00:15.811888 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9","Type":"ContainerStarted","Data":"6e028e3475d892892df0d1a9194a12252971286a24709db799b2f69eb847fbd7"} Oct 02 07:00:15 crc kubenswrapper[4786]: I1002 07:00:15.839464 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.089250331 podStartE2EDuration="59.839448712s" podCreationTimestamp="2025-10-02 06:59:16 +0000 UTC" firstStartedPulling="2025-10-02 06:59:18.831373469 +0000 UTC m=+768.952556601" lastFinishedPulling="2025-10-02 07:00:14.581571851 +0000 UTC m=+824.702754982" observedRunningTime="2025-10-02 07:00:15.834828889 +0000 UTC m=+825.956012030" watchObservedRunningTime="2025-10-02 07:00:15.839448712 +0000 UTC m=+825.960631843" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.186064 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5227a5-2fd9-4e47-ba91-e96d06ad9700" path="/var/lib/kubelet/pods/be5227a5-2fd9-4e47-ba91-e96d06ad9700/volumes" Oct 
02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.248123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:16 crc kubenswrapper[4786]: E1002 07:00:16.248266 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 07:00:16 crc kubenswrapper[4786]: E1002 07:00:16.248279 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 07:00:16 crc kubenswrapper[4786]: E1002 07:00:16.248321 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift podName:15dfcc2f-b58a-4238-a555-e0e5e4a05e4b nodeName:}" failed. No retries permitted until 2025-10-02 07:00:20.248310669 +0000 UTC m=+830.369493800 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift") pod "swift-storage-0" (UID: "15dfcc2f-b58a-4238-a555-e0e5e4a05e4b") : configmap "swift-ring-files" not found Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.422755 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mjfnf"] Oct 02 07:00:16 crc kubenswrapper[4786]: E1002 07:00:16.423021 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerName="dnsmasq-dns" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.423038 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerName="dnsmasq-dns" Oct 02 07:00:16 crc kubenswrapper[4786]: E1002 07:00:16.423054 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5227a5-2fd9-4e47-ba91-e96d06ad9700" containerName="init" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.423059 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5227a5-2fd9-4e47-ba91-e96d06ad9700" containerName="init" Oct 02 07:00:16 crc kubenswrapper[4786]: E1002 07:00:16.423070 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerName="init" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.423075 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerName="init" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.423227 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2766cdbc-f285-4d59-8fbf-bde633d4e4f8" containerName="dnsmasq-dns" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.423255 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5227a5-2fd9-4e47-ba91-e96d06ad9700" containerName="init" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 
07:00:16.423715 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.425194 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.425800 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.426097 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.429269 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mjfnf"] Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.552264 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/009aff59-5299-4ae9-a997-99c88f103a92-etc-swift\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.552334 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-dispersionconf\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.552352 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-combined-ca-bundle\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " 
pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.552471 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-ring-data-devices\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.552528 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-scripts\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.552752 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gtbl\" (UniqueName: \"kubernetes.io/projected/009aff59-5299-4ae9-a997-99c88f103a92-kube-api-access-8gtbl\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.552811 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-swiftconf\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.653976 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gtbl\" (UniqueName: \"kubernetes.io/projected/009aff59-5299-4ae9-a997-99c88f103a92-kube-api-access-8gtbl\") pod \"swift-ring-rebalance-mjfnf\" (UID: 
\"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654031 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-swiftconf\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654086 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/009aff59-5299-4ae9-a997-99c88f103a92-etc-swift\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654135 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-dispersionconf\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654150 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-combined-ca-bundle\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654171 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-ring-data-devices\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" 
Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654193 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-scripts\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654514 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/009aff59-5299-4ae9-a997-99c88f103a92-etc-swift\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654807 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-scripts\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.654900 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-ring-data-devices\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.658517 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-swiftconf\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.658602 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-dispersionconf\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.658643 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-combined-ca-bundle\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.666595 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gtbl\" (UniqueName: \"kubernetes.io/projected/009aff59-5299-4ae9-a997-99c88f103a92-kube-api-access-8gtbl\") pod \"swift-ring-rebalance-mjfnf\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.737376 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.835982 4786 generic.go:334] "Generic (PLEG): container finished" podID="106dd908-3398-489e-a39f-b684b5eecd2b" containerID="496f1a06e160bf4b7aab93d9d5e67a3d5b6b60d41956681219ef36631e1f462f" exitCode=0 Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.836052 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"106dd908-3398-489e-a39f-b684b5eecd2b","Type":"ContainerDied","Data":"496f1a06e160bf4b7aab93d9d5e67a3d5b6b60d41956681219ef36631e1f462f"} Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.844100 4786 generic.go:334] "Generic (PLEG): container finished" podID="e4c62a8c-eb69-4f97-a299-a44f87315f81" containerID="85b8695e9ffdd577f4094e6890a46b1d81c99866bb4420daeb73e5fb8d979bf8" exitCode=0 Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.844150 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4c62a8c-eb69-4f97-a299-a44f87315f81","Type":"ContainerDied","Data":"85b8695e9ffdd577f4094e6890a46b1d81c99866bb4420daeb73e5fb8d979bf8"} Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.847257 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lf7tj" event={"ID":"452e4743-083e-420b-9dfc-ea81e1376373","Type":"ContainerStarted","Data":"e06939b82542af45ddb1aa1cc3642a86b113a58263fe34e37ac5b3946e594b1b"} Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.847286 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lf7tj" event={"ID":"452e4743-083e-420b-9dfc-ea81e1376373","Type":"ContainerStarted","Data":"20e7fe2c7e5a300e383a3ff4ffe204449fa5466b6aca784133013a5af114ee59"} Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.847311 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 
07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.847403 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 07:00:16 crc kubenswrapper[4786]: I1002 07:00:16.882857 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-lf7tj" podStartSLOduration=3.434284676 podStartE2EDuration="1m0.882842863s" podCreationTimestamp="2025-10-02 06:59:16 +0000 UTC" firstStartedPulling="2025-10-02 06:59:17.143047227 +0000 UTC m=+767.264230358" lastFinishedPulling="2025-10-02 07:00:14.591605413 +0000 UTC m=+824.712788545" observedRunningTime="2025-10-02 07:00:16.879703619 +0000 UTC m=+827.000886760" watchObservedRunningTime="2025-10-02 07:00:16.882842863 +0000 UTC m=+827.004025994" Oct 02 07:00:17 crc kubenswrapper[4786]: I1002 07:00:17.108498 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mjfnf"] Oct 02 07:00:17 crc kubenswrapper[4786]: W1002 07:00:17.113661 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod009aff59_5299_4ae9_a997_99c88f103a92.slice/crio-2b5d241e5ca0a49ef16a8bc25b0546c19773a48ef24d9c0d4e74e236f56e391a WatchSource:0}: Error finding container 2b5d241e5ca0a49ef16a8bc25b0546c19773a48ef24d9c0d4e74e236f56e391a: Status 404 returned error can't find the container with id 2b5d241e5ca0a49ef16a8bc25b0546c19773a48ef24d9c0d4e74e236f56e391a Oct 02 07:00:17 crc kubenswrapper[4786]: I1002 07:00:17.115771 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 07:00:17 crc kubenswrapper[4786]: I1002 07:00:17.855117 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"106dd908-3398-489e-a39f-b684b5eecd2b","Type":"ContainerStarted","Data":"03ca96bb10b616529064a229c4ed7c671849e6826bfe88785ecc46315e9597ec"} Oct 02 07:00:17 crc kubenswrapper[4786]: 
I1002 07:00:17.856150 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mjfnf" event={"ID":"009aff59-5299-4ae9-a997-99c88f103a92","Type":"ContainerStarted","Data":"2b5d241e5ca0a49ef16a8bc25b0546c19773a48ef24d9c0d4e74e236f56e391a"} Oct 02 07:00:17 crc kubenswrapper[4786]: I1002 07:00:17.857455 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e4c62a8c-eb69-4f97-a299-a44f87315f81","Type":"ContainerStarted","Data":"f515e19cd748e93d37d32176a03dc2574494b18956d748b35e61175f1d5f2ffa"} Oct 02 07:00:17 crc kubenswrapper[4786]: I1002 07:00:17.871258 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.002796339 podStartE2EDuration="1m10.871244717s" podCreationTimestamp="2025-10-02 06:59:07 +0000 UTC" firstStartedPulling="2025-10-02 06:59:09.128332417 +0000 UTC m=+759.249515547" lastFinishedPulling="2025-10-02 07:00:12.996780794 +0000 UTC m=+823.117963925" observedRunningTime="2025-10-02 07:00:17.868265985 +0000 UTC m=+827.989449116" watchObservedRunningTime="2025-10-02 07:00:17.871244717 +0000 UTC m=+827.992427849" Oct 02 07:00:17 crc kubenswrapper[4786]: I1002 07:00:17.884777 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.529678183 podStartE2EDuration="1m10.88476528s" podCreationTimestamp="2025-10-02 06:59:07 +0000 UTC" firstStartedPulling="2025-10-02 06:59:09.650462468 +0000 UTC m=+759.771645599" lastFinishedPulling="2025-10-02 07:00:13.005549565 +0000 UTC m=+823.126732696" observedRunningTime="2025-10-02 07:00:17.881746152 +0000 UTC m=+828.002929293" watchObservedRunningTime="2025-10-02 07:00:17.88476528 +0000 UTC m=+828.005948402" Oct 02 07:00:18 crc kubenswrapper[4786]: I1002 07:00:18.297892 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 02 
07:00:18 crc kubenswrapper[4786]: I1002 07:00:18.297934 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 02 07:00:18 crc kubenswrapper[4786]: I1002 07:00:18.326884 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 02 07:00:18 crc kubenswrapper[4786]: I1002 07:00:18.718166 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 02 07:00:18 crc kubenswrapper[4786]: I1002 07:00:18.718203 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 02 07:00:19 crc kubenswrapper[4786]: I1002 07:00:19.178843 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 07:00:19 crc kubenswrapper[4786]: I1002 07:00:19.178875 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 07:00:20 crc kubenswrapper[4786]: I1002 07:00:20.298781 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:20 crc kubenswrapper[4786]: E1002 07:00:20.299523 4786 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 07:00:20 crc kubenswrapper[4786]: E1002 07:00:20.299564 4786 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 07:00:20 crc kubenswrapper[4786]: E1002 07:00:20.299621 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift 
podName:15dfcc2f-b58a-4238-a555-e0e5e4a05e4b nodeName:}" failed. No retries permitted until 2025-10-02 07:00:28.29959955 +0000 UTC m=+838.420782680 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift") pod "swift-storage-0" (UID: "15dfcc2f-b58a-4238-a555-e0e5e4a05e4b") : configmap "swift-ring-files" not found Oct 02 07:00:21 crc kubenswrapper[4786]: I1002 07:00:21.740854 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:21 crc kubenswrapper[4786]: I1002 07:00:21.777536 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7fbc4849-8ffsc"] Oct 02 07:00:21 crc kubenswrapper[4786]: I1002 07:00:21.777761 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" podUID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerName="dnsmasq-dns" containerID="cri-o://ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28" gracePeriod=10 Oct 02 07:00:21 crc kubenswrapper[4786]: I1002 07:00:21.858569 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" podUID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.107:5353: connect: connection refused" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.127435 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.230975 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-config\") pod \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.231038 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-dns-svc\") pod \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.231103 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-ovsdbserver-nb\") pod \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.231154 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppgrd\" (UniqueName: \"kubernetes.io/projected/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-kube-api-access-ppgrd\") pod \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\" (UID: \"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8\") " Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.234113 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-kube-api-access-ppgrd" (OuterVolumeSpecName: "kube-api-access-ppgrd") pod "b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" (UID: "b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8"). InnerVolumeSpecName "kube-api-access-ppgrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.260590 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" (UID: "b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.260614 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" (UID: "b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.268071 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-config" (OuterVolumeSpecName: "config") pod "b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" (UID: "b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.332537 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.332563 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.332572 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.332583 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppgrd\" (UniqueName: \"kubernetes.io/projected/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8-kube-api-access-ppgrd\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.763193 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.794566 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.891244 4786 generic.go:334] "Generic (PLEG): container finished" podID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerID="ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28" exitCode=0 Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.891292 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" 
event={"ID":"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8","Type":"ContainerDied","Data":"ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28"} Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.891320 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" event={"ID":"b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8","Type":"ContainerDied","Data":"27b35001435fe94deb2e5b5377c357a0f71e6fe677e1b8893e68b6c992c18f87"} Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.891335 4786 scope.go:117] "RemoveContainer" containerID="ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.891446 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7fbc4849-8ffsc" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.896428 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mjfnf" event={"ID":"009aff59-5299-4ae9-a997-99c88f103a92","Type":"ContainerStarted","Data":"cf7410bd94fe1858ad424da921d3a4a0af49756ff61c2581eabb9c1d35bc0373"} Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.910384 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mjfnf" podStartSLOduration=2.251670247 podStartE2EDuration="6.910370327s" podCreationTimestamp="2025-10-02 07:00:16 +0000 UTC" firstStartedPulling="2025-10-02 07:00:17.115547796 +0000 UTC m=+827.236730926" lastFinishedPulling="2025-10-02 07:00:21.774247874 +0000 UTC m=+831.895431006" observedRunningTime="2025-10-02 07:00:22.90520711 +0000 UTC m=+833.026390241" watchObservedRunningTime="2025-10-02 07:00:22.910370327 +0000 UTC m=+833.031553459" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.916646 4786 scope.go:117] "RemoveContainer" containerID="febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 
07:00:22.922739 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7fbc4849-8ffsc"] Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.926517 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c7fbc4849-8ffsc"] Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.932130 4786 scope.go:117] "RemoveContainer" containerID="ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28" Oct 02 07:00:22 crc kubenswrapper[4786]: E1002 07:00:22.932435 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28\": container with ID starting with ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28 not found: ID does not exist" containerID="ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.932469 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28"} err="failed to get container status \"ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28\": rpc error: code = NotFound desc = could not find container \"ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28\": container with ID starting with ed1c78d6a55786bfe663fcf99b9b4c7ad3e4014b1ecb21e189b03627de6dbb28 not found: ID does not exist" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.932489 4786 scope.go:117] "RemoveContainer" containerID="febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7" Oct 02 07:00:22 crc kubenswrapper[4786]: E1002 07:00:22.932752 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7\": container with ID starting with 
febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7 not found: ID does not exist" containerID="febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7" Oct 02 07:00:22 crc kubenswrapper[4786]: I1002 07:00:22.932775 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7"} err="failed to get container status \"febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7\": rpc error: code = NotFound desc = could not find container \"febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7\": container with ID starting with febdb2f533381eaa005e1ea4282f17305e5bf3168adb6d7f60a340fb982b80e7 not found: ID does not exist" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.231571 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.267520 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="e4c62a8c-eb69-4f97-a299-a44f87315f81" containerName="galera" probeResult="failure" output=< Oct 02 07:00:23 crc kubenswrapper[4786]: wsrep_local_state_comment (Joined) differs from Synced Oct 02 07:00:23 crc kubenswrapper[4786]: > Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.322578 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.437138 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 02 07:00:23 crc kubenswrapper[4786]: E1002 07:00:23.437572 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerName="dnsmasq-dns" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.437590 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerName="dnsmasq-dns" Oct 02 07:00:23 crc kubenswrapper[4786]: E1002 07:00:23.437611 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerName="init" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.437617 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerName="init" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.437772 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" containerName="dnsmasq-dns" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.438433 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.440064 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.440180 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gwc4j" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.442017 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.442414 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.450948 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.547806 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0006e64-1f5a-4050-83f6-36ce34d68bf7-config\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" 
Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.547859 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e0006e64-1f5a-4050-83f6-36ce34d68bf7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.547889 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjcss\" (UniqueName: \"kubernetes.io/projected/e0006e64-1f5a-4050-83f6-36ce34d68bf7-kube-api-access-bjcss\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.547969 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.547994 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.548007 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0006e64-1f5a-4050-83f6-36ce34d68bf7-scripts\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.548023 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.649126 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.649163 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0006e64-1f5a-4050-83f6-36ce34d68bf7-scripts\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.649191 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.649255 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0006e64-1f5a-4050-83f6-36ce34d68bf7-config\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.649294 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e0006e64-1f5a-4050-83f6-36ce34d68bf7-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.649321 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjcss\" (UniqueName: \"kubernetes.io/projected/e0006e64-1f5a-4050-83f6-36ce34d68bf7-kube-api-access-bjcss\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.649423 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.650165 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0006e64-1f5a-4050-83f6-36ce34d68bf7-config\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.650348 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e0006e64-1f5a-4050-83f6-36ce34d68bf7-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.651460 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0006e64-1f5a-4050-83f6-36ce34d68bf7-scripts\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.654272 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.654408 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.662153 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0006e64-1f5a-4050-83f6-36ce34d68bf7-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.674560 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjcss\" (UniqueName: \"kubernetes.io/projected/e0006e64-1f5a-4050-83f6-36ce34d68bf7-kube-api-access-bjcss\") pod \"ovn-northd-0\" (UID: \"e0006e64-1f5a-4050-83f6-36ce34d68bf7\") " pod="openstack/ovn-northd-0" Oct 02 07:00:23 crc kubenswrapper[4786]: I1002 07:00:23.761841 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 07:00:24 crc kubenswrapper[4786]: I1002 07:00:24.122291 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 07:00:24 crc kubenswrapper[4786]: W1002 07:00:24.127330 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0006e64_1f5a_4050_83f6_36ce34d68bf7.slice/crio-b8812e0a687a59a933bb9107ad7e48c81fbb0281067d9199b5c9c4bafa3379f4 WatchSource:0}: Error finding container b8812e0a687a59a933bb9107ad7e48c81fbb0281067d9199b5c9c4bafa3379f4: Status 404 returned error can't find the container with id b8812e0a687a59a933bb9107ad7e48c81fbb0281067d9199b5c9c4bafa3379f4 Oct 02 07:00:24 crc kubenswrapper[4786]: I1002 07:00:24.187173 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8" path="/var/lib/kubelet/pods/b3bf385b-a0e8-4687-9bf5-5a76b1ece6e8/volumes" Oct 02 07:00:24 crc kubenswrapper[4786]: I1002 07:00:24.921848 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e0006e64-1f5a-4050-83f6-36ce34d68bf7","Type":"ContainerStarted","Data":"b8812e0a687a59a933bb9107ad7e48c81fbb0281067d9199b5c9c4bafa3379f4"} Oct 02 07:00:25 crc kubenswrapper[4786]: I1002 07:00:25.928624 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e0006e64-1f5a-4050-83f6-36ce34d68bf7","Type":"ContainerStarted","Data":"06061130c2a51134f67ba8bdf19653b5c9b25005cf2780d8a78a202bc3352f32"} Oct 02 07:00:26 crc kubenswrapper[4786]: I1002 07:00:26.936984 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e0006e64-1f5a-4050-83f6-36ce34d68bf7","Type":"ContainerStarted","Data":"6c61f899c058558858d5c7524c70e24fff2b2547926050f2360cfe68b116a993"} Oct 02 07:00:26 crc kubenswrapper[4786]: I1002 07:00:26.937313 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 02 07:00:26 crc kubenswrapper[4786]: I1002 07:00:26.951292 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.5269653869999997 podStartE2EDuration="3.951273751s" podCreationTimestamp="2025-10-02 07:00:23 +0000 UTC" firstStartedPulling="2025-10-02 07:00:24.129508494 +0000 UTC m=+834.250691625" lastFinishedPulling="2025-10-02 07:00:25.553816857 +0000 UTC m=+835.674999989" observedRunningTime="2025-10-02 07:00:26.948497972 +0000 UTC m=+837.069681123" watchObservedRunningTime="2025-10-02 07:00:26.951273751 +0000 UTC m=+837.072456872" Oct 02 07:00:27 crc kubenswrapper[4786]: I1002 07:00:27.944025 4786 generic.go:334] "Generic (PLEG): container finished" podID="009aff59-5299-4ae9-a997-99c88f103a92" containerID="cf7410bd94fe1858ad424da921d3a4a0af49756ff61c2581eabb9c1d35bc0373" exitCode=0 Oct 02 07:00:27 crc kubenswrapper[4786]: I1002 07:00:27.944092 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mjfnf" event={"ID":"009aff59-5299-4ae9-a997-99c88f103a92","Type":"ContainerDied","Data":"cf7410bd94fe1858ad424da921d3a4a0af49756ff61c2581eabb9c1d35bc0373"} Oct 02 07:00:27 crc kubenswrapper[4786]: I1002 07:00:27.945434 4786 generic.go:334] "Generic (PLEG): container finished" podID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerID="38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190" exitCode=0 Oct 02 07:00:27 crc kubenswrapper[4786]: I1002 07:00:27.945517 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7a34253b-beed-468f-8bad-82366a5eb5c3","Type":"ContainerDied","Data":"38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190"} Oct 02 07:00:28 crc kubenswrapper[4786]: I1002 07:00:28.336510 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:28 crc kubenswrapper[4786]: I1002 07:00:28.341401 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/15dfcc2f-b58a-4238-a555-e0e5e4a05e4b-etc-swift\") pod \"swift-storage-0\" (UID: \"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b\") " pod="openstack/swift-storage-0" Oct 02 07:00:28 crc kubenswrapper[4786]: I1002 07:00:28.356051 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 07:00:28 crc kubenswrapper[4786]: I1002 07:00:28.790561 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 07:00:28 crc kubenswrapper[4786]: W1002 07:00:28.794834 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15dfcc2f_b58a_4238_a555_e0e5e4a05e4b.slice/crio-100d820972b5d115802ffba7d538eb093208a8c45367c45c6674cafce19ca19a WatchSource:0}: Error finding container 100d820972b5d115802ffba7d538eb093208a8c45367c45c6674cafce19ca19a: Status 404 returned error can't find the container with id 100d820972b5d115802ffba7d538eb093208a8c45367c45c6674cafce19ca19a Oct 02 07:00:28 crc kubenswrapper[4786]: I1002 07:00:28.953237 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7a34253b-beed-468f-8bad-82366a5eb5c3","Type":"ContainerStarted","Data":"270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4"} Oct 02 07:00:28 crc kubenswrapper[4786]: I1002 07:00:28.953496 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 07:00:28 crc kubenswrapper[4786]: I1002 07:00:28.954652 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"100d820972b5d115802ffba7d538eb093208a8c45367c45c6674cafce19ca19a"} Oct 02 07:00:28 crc kubenswrapper[4786]: I1002 07:00:28.972463 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.525633938 podStartE2EDuration="1m24.972440711s" podCreationTimestamp="2025-10-02 06:59:04 +0000 UTC" firstStartedPulling="2025-10-02 06:59:06.661373876 +0000 UTC m=+756.782557007" lastFinishedPulling="2025-10-02 06:59:55.108180649 +0000 UTC m=+805.229363780" observedRunningTime="2025-10-02 07:00:28.970577701 +0000 UTC m=+839.091760842" watchObservedRunningTime="2025-10-02 07:00:28.972440711 +0000 UTC m=+839.093623842" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.219447 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.235379 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.356572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-ring-data-devices\") pod \"009aff59-5299-4ae9-a997-99c88f103a92\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.356618 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-scripts\") pod \"009aff59-5299-4ae9-a997-99c88f103a92\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.356655 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gtbl\" (UniqueName: \"kubernetes.io/projected/009aff59-5299-4ae9-a997-99c88f103a92-kube-api-access-8gtbl\") pod \"009aff59-5299-4ae9-a997-99c88f103a92\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.356729 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-combined-ca-bundle\") pod \"009aff59-5299-4ae9-a997-99c88f103a92\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.356778 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/009aff59-5299-4ae9-a997-99c88f103a92-etc-swift\") pod \"009aff59-5299-4ae9-a997-99c88f103a92\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.356837 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-dispersionconf\") pod \"009aff59-5299-4ae9-a997-99c88f103a92\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.356871 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-swiftconf\") pod \"009aff59-5299-4ae9-a997-99c88f103a92\" (UID: \"009aff59-5299-4ae9-a997-99c88f103a92\") " Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.357557 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/009aff59-5299-4ae9-a997-99c88f103a92-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "009aff59-5299-4ae9-a997-99c88f103a92" (UID: "009aff59-5299-4ae9-a997-99c88f103a92"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.357685 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "009aff59-5299-4ae9-a997-99c88f103a92" (UID: "009aff59-5299-4ae9-a997-99c88f103a92"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.363794 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009aff59-5299-4ae9-a997-99c88f103a92-kube-api-access-8gtbl" (OuterVolumeSpecName: "kube-api-access-8gtbl") pod "009aff59-5299-4ae9-a997-99c88f103a92" (UID: "009aff59-5299-4ae9-a997-99c88f103a92"). InnerVolumeSpecName "kube-api-access-8gtbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.366294 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "009aff59-5299-4ae9-a997-99c88f103a92" (UID: "009aff59-5299-4ae9-a997-99c88f103a92"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.374565 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-scripts" (OuterVolumeSpecName: "scripts") pod "009aff59-5299-4ae9-a997-99c88f103a92" (UID: "009aff59-5299-4ae9-a997-99c88f103a92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.378234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "009aff59-5299-4ae9-a997-99c88f103a92" (UID: "009aff59-5299-4ae9-a997-99c88f103a92"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.380214 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "009aff59-5299-4ae9-a997-99c88f103a92" (UID: "009aff59-5299-4ae9-a997-99c88f103a92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.458933 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.458963 4786 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/009aff59-5299-4ae9-a997-99c88f103a92-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.458973 4786 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.458982 4786 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/009aff59-5299-4ae9-a997-99c88f103a92-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.458994 4786 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.459002 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/009aff59-5299-4ae9-a997-99c88f103a92-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.459011 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gtbl\" (UniqueName: \"kubernetes.io/projected/009aff59-5299-4ae9-a997-99c88f103a92-kube-api-access-8gtbl\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.485886 4786 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-s5xkn"] Oct 02 07:00:29 crc kubenswrapper[4786]: E1002 07:00:29.486280 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009aff59-5299-4ae9-a997-99c88f103a92" containerName="swift-ring-rebalance" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.486299 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="009aff59-5299-4ae9-a997-99c88f103a92" containerName="swift-ring-rebalance" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.486502 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="009aff59-5299-4ae9-a997-99c88f103a92" containerName="swift-ring-rebalance" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.487124 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s5xkn" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.495331 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s5xkn"] Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.561764 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h772b\" (UniqueName: \"kubernetes.io/projected/15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2-kube-api-access-h772b\") pod \"keystone-db-create-s5xkn\" (UID: \"15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2\") " pod="openstack/keystone-db-create-s5xkn" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.663988 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h772b\" (UniqueName: \"kubernetes.io/projected/15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2-kube-api-access-h772b\") pod \"keystone-db-create-s5xkn\" (UID: \"15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2\") " pod="openstack/keystone-db-create-s5xkn" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.678005 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h772b\" 
(UniqueName: \"kubernetes.io/projected/15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2-kube-api-access-h772b\") pod \"keystone-db-create-s5xkn\" (UID: \"15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2\") " pod="openstack/keystone-db-create-s5xkn" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.693074 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7wctg"] Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.694066 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7wctg" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.701632 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7wctg"] Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.765594 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpctz\" (UniqueName: \"kubernetes.io/projected/a4e3d221-990a-4056-ae85-ebbc7fbca4a1-kube-api-access-bpctz\") pod \"placement-db-create-7wctg\" (UID: \"a4e3d221-990a-4056-ae85-ebbc7fbca4a1\") " pod="openstack/placement-db-create-7wctg" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.799281 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-s5xkn" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.867210 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpctz\" (UniqueName: \"kubernetes.io/projected/a4e3d221-990a-4056-ae85-ebbc7fbca4a1-kube-api-access-bpctz\") pod \"placement-db-create-7wctg\" (UID: \"a4e3d221-990a-4056-ae85-ebbc7fbca4a1\") " pod="openstack/placement-db-create-7wctg" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.885116 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpctz\" (UniqueName: \"kubernetes.io/projected/a4e3d221-990a-4056-ae85-ebbc7fbca4a1-kube-api-access-bpctz\") pod \"placement-db-create-7wctg\" (UID: \"a4e3d221-990a-4056-ae85-ebbc7fbca4a1\") " pod="openstack/placement-db-create-7wctg" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.900865 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-phlf9"] Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.901655 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-phlf9" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.911269 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-phlf9"] Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.960969 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mjfnf" event={"ID":"009aff59-5299-4ae9-a997-99c88f103a92","Type":"ContainerDied","Data":"2b5d241e5ca0a49ef16a8bc25b0546c19773a48ef24d9c0d4e74e236f56e391a"} Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.961789 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b5d241e5ca0a49ef16a8bc25b0546c19773a48ef24d9c0d4e74e236f56e391a" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.960985 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mjfnf" Oct 02 07:00:29 crc kubenswrapper[4786]: I1002 07:00:29.968941 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trhrv\" (UniqueName: \"kubernetes.io/projected/4e2bd960-9783-448c-a27e-de71fc1e096d-kube-api-access-trhrv\") pod \"glance-db-create-phlf9\" (UID: \"4e2bd960-9783-448c-a27e-de71fc1e096d\") " pod="openstack/glance-db-create-phlf9" Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.016530 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7wctg" Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.070408 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trhrv\" (UniqueName: \"kubernetes.io/projected/4e2bd960-9783-448c-a27e-de71fc1e096d-kube-api-access-trhrv\") pod \"glance-db-create-phlf9\" (UID: \"4e2bd960-9783-448c-a27e-de71fc1e096d\") " pod="openstack/glance-db-create-phlf9" Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.090308 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trhrv\" (UniqueName: \"kubernetes.io/projected/4e2bd960-9783-448c-a27e-de71fc1e096d-kube-api-access-trhrv\") pod \"glance-db-create-phlf9\" (UID: \"4e2bd960-9783-448c-a27e-de71fc1e096d\") " pod="openstack/glance-db-create-phlf9" Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.201890 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s5xkn"] Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.218434 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-phlf9" Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.395238 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7wctg"] Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.612720 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-phlf9"] Oct 02 07:00:30 crc kubenswrapper[4786]: W1002 07:00:30.619670 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e2bd960_9783_448c_a27e_de71fc1e096d.slice/crio-8b3ef3e7d17150e31612e2b60416be1e8ed2d825f97adfc0d15f8f641248a6ca WatchSource:0}: Error finding container 8b3ef3e7d17150e31612e2b60416be1e8ed2d825f97adfc0d15f8f641248a6ca: Status 404 returned error can't find the container with id 8b3ef3e7d17150e31612e2b60416be1e8ed2d825f97adfc0d15f8f641248a6ca Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.971920 4786 generic.go:334] "Generic (PLEG): container finished" podID="15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2" containerID="57e4e0e0c7ec8ca559d6754d40447456328fe97d2b5a55c0d79cb7ad60f75731" exitCode=0 Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.972003 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s5xkn" event={"ID":"15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2","Type":"ContainerDied","Data":"57e4e0e0c7ec8ca559d6754d40447456328fe97d2b5a55c0d79cb7ad60f75731"} Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.972035 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s5xkn" event={"ID":"15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2","Type":"ContainerStarted","Data":"cd654a87e3a78fe29198c5ba4136ddd3f73896c16b49e8c1c7f770811e8c24fc"} Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.976274 4786 generic.go:334] "Generic (PLEG): container finished" podID="a4e3d221-990a-4056-ae85-ebbc7fbca4a1" 
containerID="5513e846693cb28e50278686823b04fe1cf590f356f9d1ac063bdd87bb8923c1" exitCode=0 Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.976333 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7wctg" event={"ID":"a4e3d221-990a-4056-ae85-ebbc7fbca4a1","Type":"ContainerDied","Data":"5513e846693cb28e50278686823b04fe1cf590f356f9d1ac063bdd87bb8923c1"} Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.976367 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7wctg" event={"ID":"a4e3d221-990a-4056-ae85-ebbc7fbca4a1","Type":"ContainerStarted","Data":"7d9a87c5cf7a4e57ffd90b88d66a3556bf9065efb0032d27a5ed24152f2a7d27"} Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.978017 4786 generic.go:334] "Generic (PLEG): container finished" podID="4e2bd960-9783-448c-a27e-de71fc1e096d" containerID="6a8dec2b7a33f18fa0586287d623ca8664d99713e2fb2d55da33bdd48062d627" exitCode=0 Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.978063 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-phlf9" event={"ID":"4e2bd960-9783-448c-a27e-de71fc1e096d","Type":"ContainerDied","Data":"6a8dec2b7a33f18fa0586287d623ca8664d99713e2fb2d55da33bdd48062d627"} Oct 02 07:00:30 crc kubenswrapper[4786]: I1002 07:00:30.978144 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-phlf9" event={"ID":"4e2bd960-9783-448c-a27e-de71fc1e096d","Type":"ContainerStarted","Data":"8b3ef3e7d17150e31612e2b60416be1e8ed2d825f97adfc0d15f8f641248a6ca"} Oct 02 07:00:31 crc kubenswrapper[4786]: I1002 07:00:31.987417 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"325a3b6af3fb0eb6f9c7dda624a25465318af9fc1342edc10145f7e026fdecfc"} Oct 02 07:00:31 crc kubenswrapper[4786]: I1002 07:00:31.987777 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"0e243a9bb55ed8695ec65f9830960b2a787849d9917e6965b987ab342f15703a"} Oct 02 07:00:31 crc kubenswrapper[4786]: I1002 07:00:31.987797 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"e0233c117e706f031915c1321fff6fbc8703cb5cde1955d785b6c6416ecc0604"} Oct 02 07:00:31 crc kubenswrapper[4786]: I1002 07:00:31.987805 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"56fc1caceb0dd43457379154a4a3f63afb610166f52c927c9db169fd0bb4d7e9"} Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.292598 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-phlf9" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.396351 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7wctg" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.400276 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-s5xkn" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.402190 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trhrv\" (UniqueName: \"kubernetes.io/projected/4e2bd960-9783-448c-a27e-de71fc1e096d-kube-api-access-trhrv\") pod \"4e2bd960-9783-448c-a27e-de71fc1e096d\" (UID: \"4e2bd960-9783-448c-a27e-de71fc1e096d\") " Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.410230 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2bd960-9783-448c-a27e-de71fc1e096d-kube-api-access-trhrv" (OuterVolumeSpecName: "kube-api-access-trhrv") pod "4e2bd960-9783-448c-a27e-de71fc1e096d" (UID: "4e2bd960-9783-448c-a27e-de71fc1e096d"). InnerVolumeSpecName "kube-api-access-trhrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.503673 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpctz\" (UniqueName: \"kubernetes.io/projected/a4e3d221-990a-4056-ae85-ebbc7fbca4a1-kube-api-access-bpctz\") pod \"a4e3d221-990a-4056-ae85-ebbc7fbca4a1\" (UID: \"a4e3d221-990a-4056-ae85-ebbc7fbca4a1\") " Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.503743 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h772b\" (UniqueName: \"kubernetes.io/projected/15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2-kube-api-access-h772b\") pod \"15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2\" (UID: \"15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2\") " Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.504227 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trhrv\" (UniqueName: \"kubernetes.io/projected/4e2bd960-9783-448c-a27e-de71fc1e096d-kube-api-access-trhrv\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.507581 4786 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2-kube-api-access-h772b" (OuterVolumeSpecName: "kube-api-access-h772b") pod "15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2" (UID: "15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2"). InnerVolumeSpecName "kube-api-access-h772b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.507728 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e3d221-990a-4056-ae85-ebbc7fbca4a1-kube-api-access-bpctz" (OuterVolumeSpecName: "kube-api-access-bpctz") pod "a4e3d221-990a-4056-ae85-ebbc7fbca4a1" (UID: "a4e3d221-990a-4056-ae85-ebbc7fbca4a1"). InnerVolumeSpecName "kube-api-access-bpctz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.605867 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpctz\" (UniqueName: \"kubernetes.io/projected/a4e3d221-990a-4056-ae85-ebbc7fbca4a1-kube-api-access-bpctz\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.605889 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h772b\" (UniqueName: \"kubernetes.io/projected/15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2-kube-api-access-h772b\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.994919 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-phlf9" event={"ID":"4e2bd960-9783-448c-a27e-de71fc1e096d","Type":"ContainerDied","Data":"8b3ef3e7d17150e31612e2b60416be1e8ed2d825f97adfc0d15f8f641248a6ca"} Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.995253 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b3ef3e7d17150e31612e2b60416be1e8ed2d825f97adfc0d15f8f641248a6ca" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 
07:00:32.994929 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-phlf9" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.999151 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s5xkn" event={"ID":"15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2","Type":"ContainerDied","Data":"cd654a87e3a78fe29198c5ba4136ddd3f73896c16b49e8c1c7f770811e8c24fc"} Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.999176 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd654a87e3a78fe29198c5ba4136ddd3f73896c16b49e8c1c7f770811e8c24fc" Oct 02 07:00:32 crc kubenswrapper[4786]: I1002 07:00:32.999196 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s5xkn" Oct 02 07:00:33 crc kubenswrapper[4786]: I1002 07:00:33.000321 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7wctg" event={"ID":"a4e3d221-990a-4056-ae85-ebbc7fbca4a1","Type":"ContainerDied","Data":"7d9a87c5cf7a4e57ffd90b88d66a3556bf9065efb0032d27a5ed24152f2a7d27"} Oct 02 07:00:33 crc kubenswrapper[4786]: I1002 07:00:33.000342 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d9a87c5cf7a4e57ffd90b88d66a3556bf9065efb0032d27a5ed24152f2a7d27" Oct 02 07:00:33 crc kubenswrapper[4786]: I1002 07:00:33.000354 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7wctg" Oct 02 07:00:34 crc kubenswrapper[4786]: I1002 07:00:34.009993 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"ce82f3125417d8e8fe798d849986b353247f5576c83295d1f3b5fcd445742e27"} Oct 02 07:00:34 crc kubenswrapper[4786]: I1002 07:00:34.010262 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"ea49d322a08e281c07ea3280a7733b4655a2ef00ac43c360986b2ce36d093a47"} Oct 02 07:00:34 crc kubenswrapper[4786]: I1002 07:00:34.010275 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"22d39df859d0cc59475a87f528a1c50fac66e3ab103270b492d7446cbb0f73f9"} Oct 02 07:00:34 crc kubenswrapper[4786]: I1002 07:00:34.010284 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"210be830dbd6cfbf261efcaf37e81cdb908e5238f4e4f990e623733606a2961c"} Oct 02 07:00:35 crc kubenswrapper[4786]: I1002 07:00:35.025034 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"9b5bc1001d4be0e05b93220090ce09ea4bc546347961eec144af564ea699c10e"} Oct 02 07:00:36 crc kubenswrapper[4786]: I1002 07:00:36.037303 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"80031a087cbcda61c8796eca3b63dcaa86b4a76daabca774930ec18bb5150050"} Oct 02 07:00:36 crc kubenswrapper[4786]: I1002 07:00:36.037345 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"6aefcd9d0d03a7f8f0d64c9335225bf735d9e704064c45749148772778dc7732"} Oct 02 07:00:36 crc kubenswrapper[4786]: I1002 07:00:36.037354 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"56f9e1f24214af7df70f3a7b02e357d4349f5d6d68b07d912e91bc0746443a72"} Oct 02 07:00:36 crc kubenswrapper[4786]: I1002 07:00:36.465439 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gl77n" podUID="d0b71af7-f7f4-45e8-b8f0-c7428f54a37d" containerName="ovn-controller" probeResult="failure" output=< Oct 02 07:00:36 crc kubenswrapper[4786]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 07:00:36 crc kubenswrapper[4786]: > Oct 02 07:00:37 crc kubenswrapper[4786]: I1002 07:00:37.045771 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"cd9eea6f32dce556555279e1c6ad7f6fb2dfb48793af6b876bcea815a4e53f12"} Oct 02 07:00:37 crc kubenswrapper[4786]: I1002 07:00:37.046012 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"4fb49c668c8d06d42850ead56021e5e0464b17f592ddbadd2d191f10ebd9fcee"} Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.055756 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"15dfcc2f-b58a-4238-a555-e0e5e4a05e4b","Type":"ContainerStarted","Data":"2c50e07b7c52e5eef5abff1c884602b6f0d74a4d466041491b79d3029f9d92bc"} Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.079564 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" 
podStartSLOduration=21.13818347 podStartE2EDuration="27.079550696s" podCreationTimestamp="2025-10-02 07:00:11 +0000 UTC" firstStartedPulling="2025-10-02 07:00:28.796746809 +0000 UTC m=+838.917929940" lastFinishedPulling="2025-10-02 07:00:34.738114035 +0000 UTC m=+844.859297166" observedRunningTime="2025-10-02 07:00:38.078901513 +0000 UTC m=+848.200084655" watchObservedRunningTime="2025-10-02 07:00:38.079550696 +0000 UTC m=+848.200733827" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.277743 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f4f889b9-jz4kr"] Oct 02 07:00:38 crc kubenswrapper[4786]: E1002 07:00:38.278020 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e3d221-990a-4056-ae85-ebbc7fbca4a1" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.278037 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e3d221-990a-4056-ae85-ebbc7fbca4a1" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: E1002 07:00:38.278056 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e2bd960-9783-448c-a27e-de71fc1e096d" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.278063 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2bd960-9783-448c-a27e-de71fc1e096d" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: E1002 07:00:38.278072 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.278078 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.278236 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.278267 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2bd960-9783-448c-a27e-de71fc1e096d" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.278279 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4e3d221-990a-4056-ae85-ebbc7fbca4a1" containerName="mariadb-database-create" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.279033 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.280760 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.295203 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f4f889b9-jz4kr"] Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.393044 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-swift-storage-0\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.393118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-sb\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.393148 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-nb\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.393172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-svc\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.393208 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-config\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.393337 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgzs\" (UniqueName: \"kubernetes.io/projected/0b02fae5-2435-4676-983f-f60bcd4d58a0-kube-api-access-7wgzs\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.495145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-svc\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.495198 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-config\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.495237 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgzs\" (UniqueName: \"kubernetes.io/projected/0b02fae5-2435-4676-983f-f60bcd4d58a0-kube-api-access-7wgzs\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.495294 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-swift-storage-0\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.495328 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-sb\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.495438 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-nb\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.495967 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-svc\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.496191 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-sb\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.496195 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-config\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.496331 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-nb\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.496614 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-swift-storage-0\") pod \"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.509897 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgzs\" (UniqueName: \"kubernetes.io/projected/0b02fae5-2435-4676-983f-f60bcd4d58a0-kube-api-access-7wgzs\") pod 
\"dnsmasq-dns-85f4f889b9-jz4kr\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.595614 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.799735 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 02 07:00:38 crc kubenswrapper[4786]: I1002 07:00:38.948450 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f4f889b9-jz4kr"] Oct 02 07:00:38 crc kubenswrapper[4786]: W1002 07:00:38.951486 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b02fae5_2435_4676_983f_f60bcd4d58a0.slice/crio-250afb4ced759ceb8348c87ff83bfabec6918a9abc8f8320bcae4e24fe034489 WatchSource:0}: Error finding container 250afb4ced759ceb8348c87ff83bfabec6918a9abc8f8320bcae4e24fe034489: Status 404 returned error can't find the container with id 250afb4ced759ceb8348c87ff83bfabec6918a9abc8f8320bcae4e24fe034489 Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.061305 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" event={"ID":"0b02fae5-2435-4676-983f-f60bcd4d58a0","Type":"ContainerStarted","Data":"6d70055c5489d28004e433bdf8256a857257c5f80b1f0b521ba0a02f1ed82edd"} Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.061340 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" event={"ID":"0b02fae5-2435-4676-983f-f60bcd4d58a0","Type":"ContainerStarted","Data":"250afb4ced759ceb8348c87ff83bfabec6918a9abc8f8320bcae4e24fe034489"} Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.530855 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5002-account-create-hczhb"] Oct 02 07:00:39 crc 
kubenswrapper[4786]: I1002 07:00:39.531786 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5002-account-create-hczhb" Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.533298 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.536469 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5002-account-create-hczhb"] Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.618728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6fb\" (UniqueName: \"kubernetes.io/projected/aa4cb6ec-a541-4d78-8410-331cd7066a02-kube-api-access-dd6fb\") pod \"keystone-5002-account-create-hczhb\" (UID: \"aa4cb6ec-a541-4d78-8410-331cd7066a02\") " pod="openstack/keystone-5002-account-create-hczhb" Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.720080 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6fb\" (UniqueName: \"kubernetes.io/projected/aa4cb6ec-a541-4d78-8410-331cd7066a02-kube-api-access-dd6fb\") pod \"keystone-5002-account-create-hczhb\" (UID: \"aa4cb6ec-a541-4d78-8410-331cd7066a02\") " pod="openstack/keystone-5002-account-create-hczhb" Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.733875 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6fb\" (UniqueName: \"kubernetes.io/projected/aa4cb6ec-a541-4d78-8410-331cd7066a02-kube-api-access-dd6fb\") pod \"keystone-5002-account-create-hczhb\" (UID: \"aa4cb6ec-a541-4d78-8410-331cd7066a02\") " pod="openstack/keystone-5002-account-create-hczhb" Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.832158 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9870-account-create-mcqqp"] Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.832987 4786 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9870-account-create-mcqqp" Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.834957 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.838004 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9870-account-create-mcqqp"] Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.854845 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5002-account-create-hczhb" Oct 02 07:00:39 crc kubenswrapper[4786]: I1002 07:00:39.923313 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxrp\" (UniqueName: \"kubernetes.io/projected/ee69903b-44e4-4424-b9ff-9b41438de93f-kube-api-access-8nxrp\") pod \"placement-9870-account-create-mcqqp\" (UID: \"ee69903b-44e4-4424-b9ff-9b41438de93f\") " pod="openstack/placement-9870-account-create-mcqqp" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.029204 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nxrp\" (UniqueName: \"kubernetes.io/projected/ee69903b-44e4-4424-b9ff-9b41438de93f-kube-api-access-8nxrp\") pod \"placement-9870-account-create-mcqqp\" (UID: \"ee69903b-44e4-4424-b9ff-9b41438de93f\") " pod="openstack/placement-9870-account-create-mcqqp" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.032380 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-71df-account-create-8kpnq"] Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.033312 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-71df-account-create-8kpnq" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.034547 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.041977 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-71df-account-create-8kpnq"] Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.044510 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nxrp\" (UniqueName: \"kubernetes.io/projected/ee69903b-44e4-4424-b9ff-9b41438de93f-kube-api-access-8nxrp\") pod \"placement-9870-account-create-mcqqp\" (UID: \"ee69903b-44e4-4424-b9ff-9b41438de93f\") " pod="openstack/placement-9870-account-create-mcqqp" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.069105 4786 generic.go:334] "Generic (PLEG): container finished" podID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerID="6d70055c5489d28004e433bdf8256a857257c5f80b1f0b521ba0a02f1ed82edd" exitCode=0 Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.069141 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" event={"ID":"0b02fae5-2435-4676-983f-f60bcd4d58a0","Type":"ContainerDied","Data":"6d70055c5489d28004e433bdf8256a857257c5f80b1f0b521ba0a02f1ed82edd"} Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.130438 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv4rc\" (UniqueName: \"kubernetes.io/projected/efc4fa25-9013-4615-867d-97e8df05d30f-kube-api-access-rv4rc\") pod \"glance-71df-account-create-8kpnq\" (UID: \"efc4fa25-9013-4615-867d-97e8df05d30f\") " pod="openstack/glance-71df-account-create-8kpnq" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.147051 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9870-account-create-mcqqp" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.220677 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5002-account-create-hczhb"] Oct 02 07:00:40 crc kubenswrapper[4786]: W1002 07:00:40.227517 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa4cb6ec_a541_4d78_8410_331cd7066a02.slice/crio-4869f66b99e08b2dc86d6faff68cf578a1359a9599cc0fdb66626620e5fa6dee WatchSource:0}: Error finding container 4869f66b99e08b2dc86d6faff68cf578a1359a9599cc0fdb66626620e5fa6dee: Status 404 returned error can't find the container with id 4869f66b99e08b2dc86d6faff68cf578a1359a9599cc0fdb66626620e5fa6dee Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.235770 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv4rc\" (UniqueName: \"kubernetes.io/projected/efc4fa25-9013-4615-867d-97e8df05d30f-kube-api-access-rv4rc\") pod \"glance-71df-account-create-8kpnq\" (UID: \"efc4fa25-9013-4615-867d-97e8df05d30f\") " pod="openstack/glance-71df-account-create-8kpnq" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.249703 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv4rc\" (UniqueName: \"kubernetes.io/projected/efc4fa25-9013-4615-867d-97e8df05d30f-kube-api-access-rv4rc\") pod \"glance-71df-account-create-8kpnq\" (UID: \"efc4fa25-9013-4615-867d-97e8df05d30f\") " pod="openstack/glance-71df-account-create-8kpnq" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.374457 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-71df-account-create-8kpnq" Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.511453 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9870-account-create-mcqqp"] Oct 02 07:00:40 crc kubenswrapper[4786]: W1002 07:00:40.518528 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee69903b_44e4_4424_b9ff_9b41438de93f.slice/crio-c9b8c7052892e001def18951d662683979c5ddb40ded4ceb6218b41b7c6d83d4 WatchSource:0}: Error finding container c9b8c7052892e001def18951d662683979c5ddb40ded4ceb6218b41b7c6d83d4: Status 404 returned error can't find the container with id c9b8c7052892e001def18951d662683979c5ddb40ded4ceb6218b41b7c6d83d4 Oct 02 07:00:40 crc kubenswrapper[4786]: I1002 07:00:40.717242 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-71df-account-create-8kpnq"] Oct 02 07:00:40 crc kubenswrapper[4786]: W1002 07:00:40.756136 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefc4fa25_9013_4615_867d_97e8df05d30f.slice/crio-ab55c0ead9213886cc4d0bd1434a30dcd0386965015d179f830724752d2e4cf4 WatchSource:0}: Error finding container ab55c0ead9213886cc4d0bd1434a30dcd0386965015d179f830724752d2e4cf4: Status 404 returned error can't find the container with id ab55c0ead9213886cc4d0bd1434a30dcd0386965015d179f830724752d2e4cf4 Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.076150 4786 generic.go:334] "Generic (PLEG): container finished" podID="aa4cb6ec-a541-4d78-8410-331cd7066a02" containerID="87fd90869d9e4fbff99cf260989e28e00b27250d2b1614cc2327a287c318437b" exitCode=0 Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.076306 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5002-account-create-hczhb" 
event={"ID":"aa4cb6ec-a541-4d78-8410-331cd7066a02","Type":"ContainerDied","Data":"87fd90869d9e4fbff99cf260989e28e00b27250d2b1614cc2327a287c318437b"} Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.076389 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5002-account-create-hczhb" event={"ID":"aa4cb6ec-a541-4d78-8410-331cd7066a02","Type":"ContainerStarted","Data":"4869f66b99e08b2dc86d6faff68cf578a1359a9599cc0fdb66626620e5fa6dee"} Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.078813 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" event={"ID":"0b02fae5-2435-4676-983f-f60bcd4d58a0","Type":"ContainerStarted","Data":"deac9b38b0b84db27e9114e0a72b9105f44b124b581466156bd7702ba72d917a"} Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.078940 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.080291 4786 generic.go:334] "Generic (PLEG): container finished" podID="ee69903b-44e4-4424-b9ff-9b41438de93f" containerID="3cbb17a720fe7ecc8ae4be1c6c60699767ee5ccea5cb632318c976a6fd62a279" exitCode=0 Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.080346 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9870-account-create-mcqqp" event={"ID":"ee69903b-44e4-4424-b9ff-9b41438de93f","Type":"ContainerDied","Data":"3cbb17a720fe7ecc8ae4be1c6c60699767ee5ccea5cb632318c976a6fd62a279"} Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.080369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9870-account-create-mcqqp" event={"ID":"ee69903b-44e4-4424-b9ff-9b41438de93f","Type":"ContainerStarted","Data":"c9b8c7052892e001def18951d662683979c5ddb40ded4ceb6218b41b7c6d83d4"} Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.081773 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="efc4fa25-9013-4615-867d-97e8df05d30f" containerID="58bbb8d245cc15875729b5bec0bb9dac8cfe41f891bb1f2826999da1bfab00bc" exitCode=0 Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.081806 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-71df-account-create-8kpnq" event={"ID":"efc4fa25-9013-4615-867d-97e8df05d30f","Type":"ContainerDied","Data":"58bbb8d245cc15875729b5bec0bb9dac8cfe41f891bb1f2826999da1bfab00bc"} Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.081821 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-71df-account-create-8kpnq" event={"ID":"efc4fa25-9013-4615-867d-97e8df05d30f","Type":"ContainerStarted","Data":"ab55c0ead9213886cc4d0bd1434a30dcd0386965015d179f830724752d2e4cf4"} Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.107216 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" podStartSLOduration=3.107202729 podStartE2EDuration="3.107202729s" podCreationTimestamp="2025-10-02 07:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:00:41.106870072 +0000 UTC m=+851.228053213" watchObservedRunningTime="2025-10-02 07:00:41.107202729 +0000 UTC m=+851.228385860" Oct 02 07:00:41 crc kubenswrapper[4786]: I1002 07:00:41.465326 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gl77n" podUID="d0b71af7-f7f4-45e8-b8f0-c7428f54a37d" containerName="ovn-controller" probeResult="failure" output=< Oct 02 07:00:41 crc kubenswrapper[4786]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 07:00:41 crc kubenswrapper[4786]: > Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.372133 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5002-account-create-hczhb" Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.448764 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9870-account-create-mcqqp" Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.453212 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-71df-account-create-8kpnq" Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.462579 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd6fb\" (UniqueName: \"kubernetes.io/projected/aa4cb6ec-a541-4d78-8410-331cd7066a02-kube-api-access-dd6fb\") pod \"aa4cb6ec-a541-4d78-8410-331cd7066a02\" (UID: \"aa4cb6ec-a541-4d78-8410-331cd7066a02\") " Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.467073 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4cb6ec-a541-4d78-8410-331cd7066a02-kube-api-access-dd6fb" (OuterVolumeSpecName: "kube-api-access-dd6fb") pod "aa4cb6ec-a541-4d78-8410-331cd7066a02" (UID: "aa4cb6ec-a541-4d78-8410-331cd7066a02"). InnerVolumeSpecName "kube-api-access-dd6fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.564032 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nxrp\" (UniqueName: \"kubernetes.io/projected/ee69903b-44e4-4424-b9ff-9b41438de93f-kube-api-access-8nxrp\") pod \"ee69903b-44e4-4424-b9ff-9b41438de93f\" (UID: \"ee69903b-44e4-4424-b9ff-9b41438de93f\") " Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.564129 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv4rc\" (UniqueName: \"kubernetes.io/projected/efc4fa25-9013-4615-867d-97e8df05d30f-kube-api-access-rv4rc\") pod \"efc4fa25-9013-4615-867d-97e8df05d30f\" (UID: \"efc4fa25-9013-4615-867d-97e8df05d30f\") " Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.564325 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd6fb\" (UniqueName: \"kubernetes.io/projected/aa4cb6ec-a541-4d78-8410-331cd7066a02-kube-api-access-dd6fb\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.566265 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee69903b-44e4-4424-b9ff-9b41438de93f-kube-api-access-8nxrp" (OuterVolumeSpecName: "kube-api-access-8nxrp") pod "ee69903b-44e4-4424-b9ff-9b41438de93f" (UID: "ee69903b-44e4-4424-b9ff-9b41438de93f"). InnerVolumeSpecName "kube-api-access-8nxrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.566903 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc4fa25-9013-4615-867d-97e8df05d30f-kube-api-access-rv4rc" (OuterVolumeSpecName: "kube-api-access-rv4rc") pod "efc4fa25-9013-4615-867d-97e8df05d30f" (UID: "efc4fa25-9013-4615-867d-97e8df05d30f"). InnerVolumeSpecName "kube-api-access-rv4rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.666185 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv4rc\" (UniqueName: \"kubernetes.io/projected/efc4fa25-9013-4615-867d-97e8df05d30f-kube-api-access-rv4rc\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:42 crc kubenswrapper[4786]: I1002 07:00:42.666206 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nxrp\" (UniqueName: \"kubernetes.io/projected/ee69903b-44e4-4424-b9ff-9b41438de93f-kube-api-access-8nxrp\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.093995 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5002-account-create-hczhb" event={"ID":"aa4cb6ec-a541-4d78-8410-331cd7066a02","Type":"ContainerDied","Data":"4869f66b99e08b2dc86d6faff68cf578a1359a9599cc0fdb66626620e5fa6dee"} Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.094015 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5002-account-create-hczhb" Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.094033 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4869f66b99e08b2dc86d6faff68cf578a1359a9599cc0fdb66626620e5fa6dee" Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.095227 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9870-account-create-mcqqp" Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.095243 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9870-account-create-mcqqp" event={"ID":"ee69903b-44e4-4424-b9ff-9b41438de93f","Type":"ContainerDied","Data":"c9b8c7052892e001def18951d662683979c5ddb40ded4ceb6218b41b7c6d83d4"} Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.095259 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b8c7052892e001def18951d662683979c5ddb40ded4ceb6218b41b7c6d83d4" Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.096381 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-71df-account-create-8kpnq" event={"ID":"efc4fa25-9013-4615-867d-97e8df05d30f","Type":"ContainerDied","Data":"ab55c0ead9213886cc4d0bd1434a30dcd0386965015d179f830724752d2e4cf4"} Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.096420 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab55c0ead9213886cc4d0bd1434a30dcd0386965015d179f830724752d2e4cf4" Oct 02 07:00:43 crc kubenswrapper[4786]: I1002 07:00:43.096427 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-71df-account-create-8kpnq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.108998 4786 generic.go:334] "Generic (PLEG): container finished" podID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerID="aeadaf247e974c8121b9c913260efe601073c3a9a7d2a40863d88fcb35011d88" exitCode=0 Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.109062 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94","Type":"ContainerDied","Data":"aeadaf247e974c8121b9c913260efe601073c3a9a7d2a40863d88fcb35011d88"} Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.172487 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-kbkxq"] Oct 02 07:00:45 crc kubenswrapper[4786]: E1002 07:00:45.173074 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee69903b-44e4-4424-b9ff-9b41438de93f" containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.173094 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee69903b-44e4-4424-b9ff-9b41438de93f" containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: E1002 07:00:45.173112 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc4fa25-9013-4615-867d-97e8df05d30f" containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.173118 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc4fa25-9013-4615-867d-97e8df05d30f" containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: E1002 07:00:45.173139 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4cb6ec-a541-4d78-8410-331cd7066a02" containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.173144 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4cb6ec-a541-4d78-8410-331cd7066a02" 
containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.173290 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee69903b-44e4-4424-b9ff-9b41438de93f" containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.173319 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc4fa25-9013-4615-867d-97e8df05d30f" containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.173329 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4cb6ec-a541-4d78-8410-331cd7066a02" containerName="mariadb-account-create" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.174474 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.175784 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.175875 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8rkqb" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.179127 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kbkxq"] Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.298664 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-db-sync-config-data\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.298725 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlxrz\" (UniqueName: 
\"kubernetes.io/projected/205062cf-def7-4b01-b1cf-7f2e1d0ef398-kube-api-access-xlxrz\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.298885 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-config-data\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.299022 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-combined-ca-bundle\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.400771 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-db-sync-config-data\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.400809 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlxrz\" (UniqueName: \"kubernetes.io/projected/205062cf-def7-4b01-b1cf-7f2e1d0ef398-kube-api-access-xlxrz\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.400889 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-config-data\") 
pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.400956 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-combined-ca-bundle\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.403864 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-db-sync-config-data\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.404611 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-combined-ca-bundle\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.404820 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-config-data\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.414461 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlxrz\" (UniqueName: \"kubernetes.io/projected/205062cf-def7-4b01-b1cf-7f2e1d0ef398-kube-api-access-xlxrz\") pod \"glance-db-sync-kbkxq\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc 
kubenswrapper[4786]: I1002 07:00:45.535586 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kbkxq" Oct 02 07:00:45 crc kubenswrapper[4786]: I1002 07:00:45.941713 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kbkxq"] Oct 02 07:00:45 crc kubenswrapper[4786]: W1002 07:00:45.945318 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod205062cf_def7_4b01_b1cf_7f2e1d0ef398.slice/crio-994ecdb9ee8fdb4e174a5736cc23f1a32a3003f91ed44e4127107c70d8f3d010 WatchSource:0}: Error finding container 994ecdb9ee8fdb4e174a5736cc23f1a32a3003f91ed44e4127107c70d8f3d010: Status 404 returned error can't find the container with id 994ecdb9ee8fdb4e174a5736cc23f1a32a3003f91ed44e4127107c70d8f3d010 Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.115727 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kbkxq" event={"ID":"205062cf-def7-4b01-b1cf-7f2e1d0ef398","Type":"ContainerStarted","Data":"994ecdb9ee8fdb4e174a5736cc23f1a32a3003f91ed44e4127107c70d8f3d010"} Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.117226 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94","Type":"ContainerStarted","Data":"d11020b391664126e8846d6b64d18e6efa22a39eedf2a85372e338a8359e5fbf"} Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.117417 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.135030 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.722066999 podStartE2EDuration="1m41.135017019s" podCreationTimestamp="2025-10-02 06:59:05 +0000 UTC" firstStartedPulling="2025-10-02 06:59:06.892661228 +0000 
UTC m=+757.013844359" lastFinishedPulling="2025-10-02 07:00:11.305611248 +0000 UTC m=+821.426794379" observedRunningTime="2025-10-02 07:00:46.131015389 +0000 UTC m=+856.252198540" watchObservedRunningTime="2025-10-02 07:00:46.135017019 +0000 UTC m=+856.256200150" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.278852 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.476517 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gl77n" podUID="d0b71af7-f7f4-45e8-b8f0-c7428f54a37d" containerName="ovn-controller" probeResult="failure" output=< Oct 02 07:00:46 crc kubenswrapper[4786]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 07:00:46 crc kubenswrapper[4786]: > Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.520769 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-78h8k"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.521605 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-78h8k" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.531225 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-78h8k"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.574192 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.577213 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lf7tj" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.616215 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mgm\" (UniqueName: \"kubernetes.io/projected/793c01ad-7938-4d1b-913d-5bfa2e48b721-kube-api-access-55mgm\") pod \"cinder-db-create-78h8k\" (UID: \"793c01ad-7938-4d1b-913d-5bfa2e48b721\") " pod="openstack/cinder-db-create-78h8k" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.633317 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-tq8qn"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.634152 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-tq8qn" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.640068 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tq8qn"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.717976 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thfq6\" (UniqueName: \"kubernetes.io/projected/24f24c17-9405-445e-9f7b-9bb550347912-kube-api-access-thfq6\") pod \"barbican-db-create-tq8qn\" (UID: \"24f24c17-9405-445e-9f7b-9bb550347912\") " pod="openstack/barbican-db-create-tq8qn" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.718136 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mgm\" (UniqueName: \"kubernetes.io/projected/793c01ad-7938-4d1b-913d-5bfa2e48b721-kube-api-access-55mgm\") pod \"cinder-db-create-78h8k\" (UID: \"793c01ad-7938-4d1b-913d-5bfa2e48b721\") " pod="openstack/cinder-db-create-78h8k" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.733112 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mgm\" (UniqueName: \"kubernetes.io/projected/793c01ad-7938-4d1b-913d-5bfa2e48b721-kube-api-access-55mgm\") pod \"cinder-db-create-78h8k\" (UID: \"793c01ad-7938-4d1b-913d-5bfa2e48b721\") " pod="openstack/cinder-db-create-78h8k" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.768635 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-sz2xw"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.769497 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.770750 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v5blg" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.772325 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.772325 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.772543 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.778256 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-sz2xw"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.802900 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gl77n-config-sdxs9"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.803776 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.805544 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.811360 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gl77n-config-sdxs9"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.819651 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-combined-ca-bundle\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.819707 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbjv\" (UniqueName: \"kubernetes.io/projected/ce72c1de-6645-43e2-88bf-11f40a02d16c-kube-api-access-snbjv\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.819763 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thfq6\" (UniqueName: \"kubernetes.io/projected/24f24c17-9405-445e-9f7b-9bb550347912-kube-api-access-thfq6\") pod \"barbican-db-create-tq8qn\" (UID: \"24f24c17-9405-445e-9f7b-9bb550347912\") " pod="openstack/barbican-db-create-tq8qn" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.819880 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-config-data\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " 
pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.837206 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thfq6\" (UniqueName: \"kubernetes.io/projected/24f24c17-9405-445e-9f7b-9bb550347912-kube-api-access-thfq6\") pod \"barbican-db-create-tq8qn\" (UID: \"24f24c17-9405-445e-9f7b-9bb550347912\") " pod="openstack/barbican-db-create-tq8qn" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.849993 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-78h8k" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.921591 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-log-ovn\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.921841 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run-ovn\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.921871 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-config-data\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.921891 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-scripts\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.921946 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97z8d\" (UniqueName: \"kubernetes.io/projected/8544053f-c2a6-47a5-a592-36037f9544a4-kube-api-access-97z8d\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.921979 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-additional-scripts\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.922007 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-combined-ca-bundle\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.922032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbjv\" (UniqueName: \"kubernetes.io/projected/ce72c1de-6645-43e2-88bf-11f40a02d16c-kube-api-access-snbjv\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.922053 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.926236 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-combined-ca-bundle\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.926838 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-config-data\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.940050 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mgr56"] Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.940881 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mgr56" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.946019 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-tq8qn" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.946585 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbjv\" (UniqueName: \"kubernetes.io/projected/ce72c1de-6645-43e2-88bf-11f40a02d16c-kube-api-access-snbjv\") pod \"keystone-db-sync-sz2xw\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:46 crc kubenswrapper[4786]: I1002 07:00:46.948496 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mgr56"] Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023165 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run-ovn\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023203 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-scripts\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023251 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97z8d\" (UniqueName: \"kubernetes.io/projected/8544053f-c2a6-47a5-a592-36037f9544a4-kube-api-access-97z8d\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023283 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-additional-scripts\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023320 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023367 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-log-ovn\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023425 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ss7h\" (UniqueName: \"kubernetes.io/projected/d12a88ec-24f7-415c-945e-21884b53b156-kube-api-access-4ss7h\") pod \"neutron-db-create-mgr56\" (UID: \"d12a88ec-24f7-415c-945e-21884b53b156\") " pod="openstack/neutron-db-create-mgr56" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023637 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run-ovn\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023669 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-log-ovn\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.023699 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.024281 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-additional-scripts\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.025282 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-scripts\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.037204 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97z8d\" (UniqueName: \"kubernetes.io/projected/8544053f-c2a6-47a5-a592-36037f9544a4-kube-api-access-97z8d\") pod \"ovn-controller-gl77n-config-sdxs9\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.081985 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.118097 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.124196 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ss7h\" (UniqueName: \"kubernetes.io/projected/d12a88ec-24f7-415c-945e-21884b53b156-kube-api-access-4ss7h\") pod \"neutron-db-create-mgr56\" (UID: \"d12a88ec-24f7-415c-945e-21884b53b156\") " pod="openstack/neutron-db-create-mgr56" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.140283 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ss7h\" (UniqueName: \"kubernetes.io/projected/d12a88ec-24f7-415c-945e-21884b53b156-kube-api-access-4ss7h\") pod \"neutron-db-create-mgr56\" (UID: \"d12a88ec-24f7-415c-945e-21884b53b156\") " pod="openstack/neutron-db-create-mgr56" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.260576 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-78h8k"] Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.261983 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mgr56" Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.370141 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-tq8qn"] Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.505366 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-sz2xw"] Oct 02 07:00:47 crc kubenswrapper[4786]: W1002 07:00:47.510784 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce72c1de_6645_43e2_88bf_11f40a02d16c.slice/crio-b68a32fc787cf254118d6cc7b0726d1de084f7593af0d77d0cdc111423bcfc35 WatchSource:0}: Error finding container b68a32fc787cf254118d6cc7b0726d1de084f7593af0d77d0cdc111423bcfc35: Status 404 returned error can't find the container with id b68a32fc787cf254118d6cc7b0726d1de084f7593af0d77d0cdc111423bcfc35 Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.573523 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gl77n-config-sdxs9"] Oct 02 07:00:47 crc kubenswrapper[4786]: W1002 07:00:47.617664 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8544053f_c2a6_47a5_a592_36037f9544a4.slice/crio-71f670b14bbb5d0566f8a9156f52c1c59e61f39c026a1f641c7ec37425b26da5 WatchSource:0}: Error finding container 71f670b14bbb5d0566f8a9156f52c1c59e61f39c026a1f641c7ec37425b26da5: Status 404 returned error can't find the container with id 71f670b14bbb5d0566f8a9156f52c1c59e61f39c026a1f641c7ec37425b26da5 Oct 02 07:00:47 crc kubenswrapper[4786]: I1002 07:00:47.694056 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mgr56"] Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.135447 4786 generic.go:334] "Generic (PLEG): container finished" podID="24f24c17-9405-445e-9f7b-9bb550347912" 
containerID="6038acf2fde2f928a0de0cf0a88fe4d098677f8e301447053373f6194a309e6c" exitCode=0 Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.135637 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tq8qn" event={"ID":"24f24c17-9405-445e-9f7b-9bb550347912","Type":"ContainerDied","Data":"6038acf2fde2f928a0de0cf0a88fe4d098677f8e301447053373f6194a309e6c"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.135661 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tq8qn" event={"ID":"24f24c17-9405-445e-9f7b-9bb550347912","Type":"ContainerStarted","Data":"feb4c7c5745b6aa4fc404a3e8163ce8278f1ce19379812d9328273677e8a2fc9"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.137800 4786 generic.go:334] "Generic (PLEG): container finished" podID="793c01ad-7938-4d1b-913d-5bfa2e48b721" containerID="6909801ba952a1df67fefdef25050a9855909b9742b17e77bcb85ab885236d9a" exitCode=0 Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.137845 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-78h8k" event={"ID":"793c01ad-7938-4d1b-913d-5bfa2e48b721","Type":"ContainerDied","Data":"6909801ba952a1df67fefdef25050a9855909b9742b17e77bcb85ab885236d9a"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.137862 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-78h8k" event={"ID":"793c01ad-7938-4d1b-913d-5bfa2e48b721","Type":"ContainerStarted","Data":"79d5a7a682ebb7f487dc02e1d8659f83bbfbb2c63554965f6e8da2f8d0da1a70"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.139013 4786 generic.go:334] "Generic (PLEG): container finished" podID="8544053f-c2a6-47a5-a592-36037f9544a4" containerID="da6420d39a5a07d64561c400d6cad60810b13256e8b3defafa02bf67865c0716" exitCode=0 Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.139046 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-gl77n-config-sdxs9" event={"ID":"8544053f-c2a6-47a5-a592-36037f9544a4","Type":"ContainerDied","Data":"da6420d39a5a07d64561c400d6cad60810b13256e8b3defafa02bf67865c0716"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.139060 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gl77n-config-sdxs9" event={"ID":"8544053f-c2a6-47a5-a592-36037f9544a4","Type":"ContainerStarted","Data":"71f670b14bbb5d0566f8a9156f52c1c59e61f39c026a1f641c7ec37425b26da5"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.140467 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-sz2xw" event={"ID":"ce72c1de-6645-43e2-88bf-11f40a02d16c","Type":"ContainerStarted","Data":"b68a32fc787cf254118d6cc7b0726d1de084f7593af0d77d0cdc111423bcfc35"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.141646 4786 generic.go:334] "Generic (PLEG): container finished" podID="d12a88ec-24f7-415c-945e-21884b53b156" containerID="ba5418438a4f7505b4c9a04aea6183e45dc55c96f799083df04b85b6faaa645f" exitCode=0 Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.141673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mgr56" event={"ID":"d12a88ec-24f7-415c-945e-21884b53b156","Type":"ContainerDied","Data":"ba5418438a4f7505b4c9a04aea6183e45dc55c96f799083df04b85b6faaa645f"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.141709 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mgr56" event={"ID":"d12a88ec-24f7-415c-945e-21884b53b156","Type":"ContainerStarted","Data":"6697d55aeb4bee5e22d08e9e768499841f30e674ae685237a8bbbf196966b7cf"} Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.597088 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.633685 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-78586fdbff-kkx7p"] Oct 02 07:00:48 crc kubenswrapper[4786]: I1002 07:00:48.633891 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" podUID="b03cd4b9-a66a-4a03-9915-88fe79682756" containerName="dnsmasq-dns" containerID="cri-o://978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd" gracePeriod=10 Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.091194 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.149120 4786 generic.go:334] "Generic (PLEG): container finished" podID="b03cd4b9-a66a-4a03-9915-88fe79682756" containerID="978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd" exitCode=0 Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.149142 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" event={"ID":"b03cd4b9-a66a-4a03-9915-88fe79682756","Type":"ContainerDied","Data":"978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd"} Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.149168 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.149174 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78586fdbff-kkx7p" event={"ID":"b03cd4b9-a66a-4a03-9915-88fe79682756","Type":"ContainerDied","Data":"5aa4432afde477dd048a128391e2ce4001d982a43920192413f5e56d5d70a491"} Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.149190 4786 scope.go:117] "RemoveContainer" containerID="978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.184237 4786 scope.go:117] "RemoveContainer" containerID="bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.214633 4786 scope.go:117] "RemoveContainer" containerID="978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd" Oct 02 07:00:49 crc kubenswrapper[4786]: E1002 07:00:49.214978 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd\": container with ID starting with 978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd not found: ID does not exist" containerID="978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.215005 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd"} err="failed to get container status \"978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd\": rpc error: code = NotFound desc = could not find container \"978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd\": container with ID starting with 978d522489e0ca9091396050737187859bdcda14a7978f0915d297570da9e0cd not found: ID does not exist" Oct 02 07:00:49 crc 
kubenswrapper[4786]: I1002 07:00:49.215033 4786 scope.go:117] "RemoveContainer" containerID="bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd" Oct 02 07:00:49 crc kubenswrapper[4786]: E1002 07:00:49.215273 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd\": container with ID starting with bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd not found: ID does not exist" containerID="bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.215289 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd"} err="failed to get container status \"bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd\": rpc error: code = NotFound desc = could not find container \"bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd\": container with ID starting with bd11e9dc6bee9977aa1b7550185798138f2c971138849ee2678dd873d5f34cfd not found: ID does not exist" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.261161 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-sb\") pod \"b03cd4b9-a66a-4a03-9915-88fe79682756\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.261200 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-config\") pod \"b03cd4b9-a66a-4a03-9915-88fe79682756\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.261234 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7tjz\" (UniqueName: \"kubernetes.io/projected/b03cd4b9-a66a-4a03-9915-88fe79682756-kube-api-access-l7tjz\") pod \"b03cd4b9-a66a-4a03-9915-88fe79682756\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.261309 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-nb\") pod \"b03cd4b9-a66a-4a03-9915-88fe79682756\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.261460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-dns-svc\") pod \"b03cd4b9-a66a-4a03-9915-88fe79682756\" (UID: \"b03cd4b9-a66a-4a03-9915-88fe79682756\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.277556 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03cd4b9-a66a-4a03-9915-88fe79682756-kube-api-access-l7tjz" (OuterVolumeSpecName: "kube-api-access-l7tjz") pod "b03cd4b9-a66a-4a03-9915-88fe79682756" (UID: "b03cd4b9-a66a-4a03-9915-88fe79682756"). InnerVolumeSpecName "kube-api-access-l7tjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.294610 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b03cd4b9-a66a-4a03-9915-88fe79682756" (UID: "b03cd4b9-a66a-4a03-9915-88fe79682756"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.300396 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-config" (OuterVolumeSpecName: "config") pod "b03cd4b9-a66a-4a03-9915-88fe79682756" (UID: "b03cd4b9-a66a-4a03-9915-88fe79682756"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.303719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b03cd4b9-a66a-4a03-9915-88fe79682756" (UID: "b03cd4b9-a66a-4a03-9915-88fe79682756"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.307818 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b03cd4b9-a66a-4a03-9915-88fe79682756" (UID: "b03cd4b9-a66a-4a03-9915-88fe79682756"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.363165 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.363190 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.363200 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7tjz\" (UniqueName: \"kubernetes.io/projected/b03cd4b9-a66a-4a03-9915-88fe79682756-kube-api-access-l7tjz\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.363210 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.363218 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b03cd4b9-a66a-4a03-9915-88fe79682756-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.441875 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-tq8qn" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.479825 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78586fdbff-kkx7p"] Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.484458 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78586fdbff-kkx7p"] Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.521321 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mgr56" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.536049 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.555016 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-78h8k" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.566105 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thfq6\" (UniqueName: \"kubernetes.io/projected/24f24c17-9405-445e-9f7b-9bb550347912-kube-api-access-thfq6\") pod \"24f24c17-9405-445e-9f7b-9bb550347912\" (UID: \"24f24c17-9405-445e-9f7b-9bb550347912\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.577519 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f24c17-9405-445e-9f7b-9bb550347912-kube-api-access-thfq6" (OuterVolumeSpecName: "kube-api-access-thfq6") pod "24f24c17-9405-445e-9f7b-9bb550347912" (UID: "24f24c17-9405-445e-9f7b-9bb550347912"). InnerVolumeSpecName "kube-api-access-thfq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669105 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97z8d\" (UniqueName: \"kubernetes.io/projected/8544053f-c2a6-47a5-a592-36037f9544a4-kube-api-access-97z8d\") pod \"8544053f-c2a6-47a5-a592-36037f9544a4\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669170 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55mgm\" (UniqueName: \"kubernetes.io/projected/793c01ad-7938-4d1b-913d-5bfa2e48b721-kube-api-access-55mgm\") pod \"793c01ad-7938-4d1b-913d-5bfa2e48b721\" (UID: \"793c01ad-7938-4d1b-913d-5bfa2e48b721\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669259 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run-ovn\") pod \"8544053f-c2a6-47a5-a592-36037f9544a4\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669344 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-additional-scripts\") pod \"8544053f-c2a6-47a5-a592-36037f9544a4\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669376 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-scripts\") pod \"8544053f-c2a6-47a5-a592-36037f9544a4\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669411 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run\") pod \"8544053f-c2a6-47a5-a592-36037f9544a4\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669448 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ss7h\" (UniqueName: \"kubernetes.io/projected/d12a88ec-24f7-415c-945e-21884b53b156-kube-api-access-4ss7h\") pod \"d12a88ec-24f7-415c-945e-21884b53b156\" (UID: \"d12a88ec-24f7-415c-945e-21884b53b156\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669460 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-log-ovn\") pod \"8544053f-c2a6-47a5-a592-36037f9544a4\" (UID: \"8544053f-c2a6-47a5-a592-36037f9544a4\") " Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669874 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thfq6\" (UniqueName: \"kubernetes.io/projected/24f24c17-9405-445e-9f7b-9bb550347912-kube-api-access-thfq6\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.669916 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8544053f-c2a6-47a5-a592-36037f9544a4" (UID: "8544053f-c2a6-47a5-a592-36037f9544a4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.670121 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run" (OuterVolumeSpecName: "var-run") pod "8544053f-c2a6-47a5-a592-36037f9544a4" (UID: "8544053f-c2a6-47a5-a592-36037f9544a4"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.670240 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8544053f-c2a6-47a5-a592-36037f9544a4" (UID: "8544053f-c2a6-47a5-a592-36037f9544a4"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.670594 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8544053f-c2a6-47a5-a592-36037f9544a4" (UID: "8544053f-c2a6-47a5-a592-36037f9544a4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.672160 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-scripts" (OuterVolumeSpecName: "scripts") pod "8544053f-c2a6-47a5-a592-36037f9544a4" (UID: "8544053f-c2a6-47a5-a592-36037f9544a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.673127 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12a88ec-24f7-415c-945e-21884b53b156-kube-api-access-4ss7h" (OuterVolumeSpecName: "kube-api-access-4ss7h") pod "d12a88ec-24f7-415c-945e-21884b53b156" (UID: "d12a88ec-24f7-415c-945e-21884b53b156"). InnerVolumeSpecName "kube-api-access-4ss7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.673576 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/793c01ad-7938-4d1b-913d-5bfa2e48b721-kube-api-access-55mgm" (OuterVolumeSpecName: "kube-api-access-55mgm") pod "793c01ad-7938-4d1b-913d-5bfa2e48b721" (UID: "793c01ad-7938-4d1b-913d-5bfa2e48b721"). InnerVolumeSpecName "kube-api-access-55mgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.673654 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8544053f-c2a6-47a5-a592-36037f9544a4-kube-api-access-97z8d" (OuterVolumeSpecName: "kube-api-access-97z8d") pod "8544053f-c2a6-47a5-a592-36037f9544a4" (UID: "8544053f-c2a6-47a5-a592-36037f9544a4"). InnerVolumeSpecName "kube-api-access-97z8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.772314 4786 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.772343 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8544053f-c2a6-47a5-a592-36037f9544a4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.772354 4786 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.772362 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ss7h\" (UniqueName: 
\"kubernetes.io/projected/d12a88ec-24f7-415c-945e-21884b53b156-kube-api-access-4ss7h\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.772372 4786 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.772379 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97z8d\" (UniqueName: \"kubernetes.io/projected/8544053f-c2a6-47a5-a592-36037f9544a4-kube-api-access-97z8d\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.772389 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55mgm\" (UniqueName: \"kubernetes.io/projected/793c01ad-7938-4d1b-913d-5bfa2e48b721-kube-api-access-55mgm\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:49 crc kubenswrapper[4786]: I1002 07:00:49.772397 4786 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8544053f-c2a6-47a5-a592-36037f9544a4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.158596 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-tq8qn" event={"ID":"24f24c17-9405-445e-9f7b-9bb550347912","Type":"ContainerDied","Data":"feb4c7c5745b6aa4fc404a3e8163ce8278f1ce19379812d9328273677e8a2fc9"} Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.158990 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feb4c7c5745b6aa4fc404a3e8163ce8278f1ce19379812d9328273677e8a2fc9" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.158611 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-tq8qn" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.160985 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-78h8k" event={"ID":"793c01ad-7938-4d1b-913d-5bfa2e48b721","Type":"ContainerDied","Data":"79d5a7a682ebb7f487dc02e1d8659f83bbfbb2c63554965f6e8da2f8d0da1a70"} Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.161030 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79d5a7a682ebb7f487dc02e1d8659f83bbfbb2c63554965f6e8da2f8d0da1a70" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.161100 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-78h8k" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.162851 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gl77n-config-sdxs9" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.162859 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gl77n-config-sdxs9" event={"ID":"8544053f-c2a6-47a5-a592-36037f9544a4","Type":"ContainerDied","Data":"71f670b14bbb5d0566f8a9156f52c1c59e61f39c026a1f641c7ec37425b26da5"} Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.162971 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f670b14bbb5d0566f8a9156f52c1c59e61f39c026a1f641c7ec37425b26da5" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.166289 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mgr56" event={"ID":"d12a88ec-24f7-415c-945e-21884b53b156","Type":"ContainerDied","Data":"6697d55aeb4bee5e22d08e9e768499841f30e674ae685237a8bbbf196966b7cf"} Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.166292 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mgr56" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.166315 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6697d55aeb4bee5e22d08e9e768499841f30e674ae685237a8bbbf196966b7cf" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.187440 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03cd4b9-a66a-4a03-9915-88fe79682756" path="/var/lib/kubelet/pods/b03cd4b9-a66a-4a03-9915-88fe79682756/volumes" Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.614884 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gl77n-config-sdxs9"] Oct 02 07:00:50 crc kubenswrapper[4786]: I1002 07:00:50.623746 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gl77n-config-sdxs9"] Oct 02 07:00:51 crc kubenswrapper[4786]: I1002 07:00:51.531370 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-gl77n" Oct 02 07:00:52 crc kubenswrapper[4786]: I1002 07:00:52.187460 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8544053f-c2a6-47a5-a592-36037f9544a4" path="/var/lib/kubelet/pods/8544053f-c2a6-47a5-a592-36037f9544a4/volumes" Oct 02 07:00:53 crc kubenswrapper[4786]: I1002 07:00:53.201169 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-sz2xw" event={"ID":"ce72c1de-6645-43e2-88bf-11f40a02d16c","Type":"ContainerStarted","Data":"68819bd3851a2c4cd97a8f88ce5331a658e7d9c9c14dced99463f14f11a8fe90"} Oct 02 07:00:53 crc kubenswrapper[4786]: I1002 07:00:53.218869 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-sz2xw" podStartSLOduration=2.1004163 podStartE2EDuration="7.21885613s" podCreationTimestamp="2025-10-02 07:00:46 +0000 UTC" firstStartedPulling="2025-10-02 07:00:47.513312902 +0000 UTC m=+857.634496034" 
lastFinishedPulling="2025-10-02 07:00:52.631752732 +0000 UTC m=+862.752935864" observedRunningTime="2025-10-02 07:00:53.21373386 +0000 UTC m=+863.334917001" watchObservedRunningTime="2025-10-02 07:00:53.21885613 +0000 UTC m=+863.340039261" Oct 02 07:00:55 crc kubenswrapper[4786]: I1002 07:00:55.213426 4786 generic.go:334] "Generic (PLEG): container finished" podID="ce72c1de-6645-43e2-88bf-11f40a02d16c" containerID="68819bd3851a2c4cd97a8f88ce5331a658e7d9c9c14dced99463f14f11a8fe90" exitCode=0 Oct 02 07:00:55 crc kubenswrapper[4786]: I1002 07:00:55.213461 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-sz2xw" event={"ID":"ce72c1de-6645-43e2-88bf-11f40a02d16c","Type":"ContainerDied","Data":"68819bd3851a2c4cd97a8f88ce5331a658e7d9c9c14dced99463f14f11a8fe90"} Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.518865 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.642586 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0a22-account-create-qrrp9"] Oct 02 07:00:56 crc kubenswrapper[4786]: E1002 07:00:56.642928 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8544053f-c2a6-47a5-a592-36037f9544a4" containerName="ovn-config" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.642946 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8544053f-c2a6-47a5-a592-36037f9544a4" containerName="ovn-config" Oct 02 07:00:56 crc kubenswrapper[4786]: E1002 07:00:56.642969 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793c01ad-7938-4d1b-913d-5bfa2e48b721" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.642975 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="793c01ad-7938-4d1b-913d-5bfa2e48b721" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: E1002 
07:00:56.642984 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03cd4b9-a66a-4a03-9915-88fe79682756" containerName="init" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.642989 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03cd4b9-a66a-4a03-9915-88fe79682756" containerName="init" Oct 02 07:00:56 crc kubenswrapper[4786]: E1002 07:00:56.642998 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f24c17-9405-445e-9f7b-9bb550347912" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.643004 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f24c17-9405-445e-9f7b-9bb550347912" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: E1002 07:00:56.643012 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03cd4b9-a66a-4a03-9915-88fe79682756" containerName="dnsmasq-dns" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.643017 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03cd4b9-a66a-4a03-9915-88fe79682756" containerName="dnsmasq-dns" Oct 02 07:00:56 crc kubenswrapper[4786]: E1002 07:00:56.643024 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12a88ec-24f7-415c-945e-21884b53b156" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.643030 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12a88ec-24f7-415c-945e-21884b53b156" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.643184 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12a88ec-24f7-415c-945e-21884b53b156" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.643195 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8544053f-c2a6-47a5-a592-36037f9544a4" containerName="ovn-config" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 
07:00:56.643203 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f24c17-9405-445e-9f7b-9bb550347912" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.643210 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="793c01ad-7938-4d1b-913d-5bfa2e48b721" containerName="mariadb-database-create" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.643221 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03cd4b9-a66a-4a03-9915-88fe79682756" containerName="dnsmasq-dns" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.643664 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a22-account-create-qrrp9" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.645625 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.656102 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0a22-account-create-qrrp9"] Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.744330 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-149f-account-create-lxndm"] Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.745228 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-149f-account-create-lxndm" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.752514 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-149f-account-create-lxndm"] Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.752772 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.766086 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js58z\" (UniqueName: \"kubernetes.io/projected/ed6d6948-364a-4b79-ba0b-7ea0645e36e4-kube-api-access-js58z\") pod \"barbican-0a22-account-create-qrrp9\" (UID: \"ed6d6948-364a-4b79-ba0b-7ea0645e36e4\") " pod="openstack/barbican-0a22-account-create-qrrp9" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.867746 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js58z\" (UniqueName: \"kubernetes.io/projected/ed6d6948-364a-4b79-ba0b-7ea0645e36e4-kube-api-access-js58z\") pod \"barbican-0a22-account-create-qrrp9\" (UID: \"ed6d6948-364a-4b79-ba0b-7ea0645e36e4\") " pod="openstack/barbican-0a22-account-create-qrrp9" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.867968 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9fd\" (UniqueName: \"kubernetes.io/projected/468d7644-ca17-43aa-88b5-f4917044b91f-kube-api-access-2p9fd\") pod \"cinder-149f-account-create-lxndm\" (UID: \"468d7644-ca17-43aa-88b5-f4917044b91f\") " pod="openstack/cinder-149f-account-create-lxndm" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.885541 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js58z\" (UniqueName: \"kubernetes.io/projected/ed6d6948-364a-4b79-ba0b-7ea0645e36e4-kube-api-access-js58z\") pod \"barbican-0a22-account-create-qrrp9\" (UID: 
\"ed6d6948-364a-4b79-ba0b-7ea0645e36e4\") " pod="openstack/barbican-0a22-account-create-qrrp9" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.957762 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4170-account-create-6l78c"] Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.959470 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4170-account-create-6l78c" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.961546 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.964002 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a22-account-create-qrrp9" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.965933 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4170-account-create-6l78c"] Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.968917 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9fd\" (UniqueName: \"kubernetes.io/projected/468d7644-ca17-43aa-88b5-f4917044b91f-kube-api-access-2p9fd\") pod \"cinder-149f-account-create-lxndm\" (UID: \"468d7644-ca17-43aa-88b5-f4917044b91f\") " pod="openstack/cinder-149f-account-create-lxndm" Oct 02 07:00:56 crc kubenswrapper[4786]: I1002 07:00:56.982266 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9fd\" (UniqueName: \"kubernetes.io/projected/468d7644-ca17-43aa-88b5-f4917044b91f-kube-api-access-2p9fd\") pod \"cinder-149f-account-create-lxndm\" (UID: \"468d7644-ca17-43aa-88b5-f4917044b91f\") " pod="openstack/cinder-149f-account-create-lxndm" Oct 02 07:00:57 crc kubenswrapper[4786]: I1002 07:00:57.061552 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-149f-account-create-lxndm" Oct 02 07:00:57 crc kubenswrapper[4786]: I1002 07:00:57.070313 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dp69\" (UniqueName: \"kubernetes.io/projected/7e8a0cbf-1247-443e-b393-b4981c99f28f-kube-api-access-9dp69\") pod \"neutron-4170-account-create-6l78c\" (UID: \"7e8a0cbf-1247-443e-b393-b4981c99f28f\") " pod="openstack/neutron-4170-account-create-6l78c" Oct 02 07:00:57 crc kubenswrapper[4786]: I1002 07:00:57.171670 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dp69\" (UniqueName: \"kubernetes.io/projected/7e8a0cbf-1247-443e-b393-b4981c99f28f-kube-api-access-9dp69\") pod \"neutron-4170-account-create-6l78c\" (UID: \"7e8a0cbf-1247-443e-b393-b4981c99f28f\") " pod="openstack/neutron-4170-account-create-6l78c" Oct 02 07:00:57 crc kubenswrapper[4786]: I1002 07:00:57.184324 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dp69\" (UniqueName: \"kubernetes.io/projected/7e8a0cbf-1247-443e-b393-b4981c99f28f-kube-api-access-9dp69\") pod \"neutron-4170-account-create-6l78c\" (UID: \"7e8a0cbf-1247-443e-b393-b4981c99f28f\") " pod="openstack/neutron-4170-account-create-6l78c" Oct 02 07:00:57 crc kubenswrapper[4786]: I1002 07:00:57.275875 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4170-account-create-6l78c" Oct 02 07:01:03 crc kubenswrapper[4786]: I1002 07:01:03.836882 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:01:03 crc kubenswrapper[4786]: I1002 07:01:03.956581 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-combined-ca-bundle\") pod \"ce72c1de-6645-43e2-88bf-11f40a02d16c\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " Oct 02 07:01:03 crc kubenswrapper[4786]: I1002 07:01:03.956619 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snbjv\" (UniqueName: \"kubernetes.io/projected/ce72c1de-6645-43e2-88bf-11f40a02d16c-kube-api-access-snbjv\") pod \"ce72c1de-6645-43e2-88bf-11f40a02d16c\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " Oct 02 07:01:03 crc kubenswrapper[4786]: I1002 07:01:03.956682 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-config-data\") pod \"ce72c1de-6645-43e2-88bf-11f40a02d16c\" (UID: \"ce72c1de-6645-43e2-88bf-11f40a02d16c\") " Oct 02 07:01:03 crc kubenswrapper[4786]: I1002 07:01:03.961826 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce72c1de-6645-43e2-88bf-11f40a02d16c-kube-api-access-snbjv" (OuterVolumeSpecName: "kube-api-access-snbjv") pod "ce72c1de-6645-43e2-88bf-11f40a02d16c" (UID: "ce72c1de-6645-43e2-88bf-11f40a02d16c"). InnerVolumeSpecName "kube-api-access-snbjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:03 crc kubenswrapper[4786]: I1002 07:01:03.975172 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce72c1de-6645-43e2-88bf-11f40a02d16c" (UID: "ce72c1de-6645-43e2-88bf-11f40a02d16c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:03 crc kubenswrapper[4786]: I1002 07:01:03.992808 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-config-data" (OuterVolumeSpecName: "config-data") pod "ce72c1de-6645-43e2-88bf-11f40a02d16c" (UID: "ce72c1de-6645-43e2-88bf-11f40a02d16c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.057974 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.058006 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snbjv\" (UniqueName: \"kubernetes.io/projected/ce72c1de-6645-43e2-88bf-11f40a02d16c-kube-api-access-snbjv\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.058017 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce72c1de-6645-43e2-88bf-11f40a02d16c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.144144 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-149f-account-create-lxndm"] Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.149828 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0a22-account-create-qrrp9"] Oct 02 07:01:04 crc kubenswrapper[4786]: W1002 07:01:04.152129 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded6d6948_364a_4b79_ba0b_7ea0645e36e4.slice/crio-db427aba8384cf7099be6f9fdb722904439b8aab8dc7555fe019c77b46fd1757 WatchSource:0}: Error finding container 
db427aba8384cf7099be6f9fdb722904439b8aab8dc7555fe019c77b46fd1757: Status 404 returned error can't find the container with id db427aba8384cf7099be6f9fdb722904439b8aab8dc7555fe019c77b46fd1757 Oct 02 07:01:04 crc kubenswrapper[4786]: W1002 07:01:04.154033 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod468d7644_ca17_43aa_88b5_f4917044b91f.slice/crio-3fadb1010c409b4edf2397a691749c9597bbb35a03731f26c0d43d96d06509a4 WatchSource:0}: Error finding container 3fadb1010c409b4edf2397a691749c9597bbb35a03731f26c0d43d96d06509a4: Status 404 returned error can't find the container with id 3fadb1010c409b4edf2397a691749c9597bbb35a03731f26c0d43d96d06509a4 Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.155385 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4170-account-create-6l78c"] Oct 02 07:01:04 crc kubenswrapper[4786]: W1002 07:01:04.158865 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e8a0cbf_1247_443e_b393_b4981c99f28f.slice/crio-fb8132ffae98ba016e22771b7be09c03d1d244c4e8c3dfd11c18baee16287dbd WatchSource:0}: Error finding container fb8132ffae98ba016e22771b7be09c03d1d244c4e8c3dfd11c18baee16287dbd: Status 404 returned error can't find the container with id fb8132ffae98ba016e22771b7be09c03d1d244c4e8c3dfd11c18baee16287dbd Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.265044 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4170-account-create-6l78c" event={"ID":"7e8a0cbf-1247-443e-b393-b4981c99f28f","Type":"ContainerStarted","Data":"fb8132ffae98ba016e22771b7be09c03d1d244c4e8c3dfd11c18baee16287dbd"} Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.266302 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-sz2xw" 
event={"ID":"ce72c1de-6645-43e2-88bf-11f40a02d16c","Type":"ContainerDied","Data":"b68a32fc787cf254118d6cc7b0726d1de084f7593af0d77d0cdc111423bcfc35"} Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.266315 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-sz2xw" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.266326 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b68a32fc787cf254118d6cc7b0726d1de084f7593af0d77d0cdc111423bcfc35" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.267784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kbkxq" event={"ID":"205062cf-def7-4b01-b1cf-7f2e1d0ef398","Type":"ContainerStarted","Data":"995abe9d11b6677d42d75ca72fd216368afdf472ab80eca4fcdbd5575d43730d"} Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.269274 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a22-account-create-qrrp9" event={"ID":"ed6d6948-364a-4b79-ba0b-7ea0645e36e4","Type":"ContainerStarted","Data":"db427aba8384cf7099be6f9fdb722904439b8aab8dc7555fe019c77b46fd1757"} Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.270203 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-149f-account-create-lxndm" event={"ID":"468d7644-ca17-43aa-88b5-f4917044b91f","Type":"ContainerStarted","Data":"3fadb1010c409b4edf2397a691749c9597bbb35a03731f26c0d43d96d06509a4"} Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.284400 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-kbkxq" podStartSLOduration=1.430109531 podStartE2EDuration="19.284389555s" podCreationTimestamp="2025-10-02 07:00:45 +0000 UTC" firstStartedPulling="2025-10-02 07:00:45.947335822 +0000 UTC m=+856.068518954" lastFinishedPulling="2025-10-02 07:01:03.801615847 +0000 UTC m=+873.922798978" observedRunningTime="2025-10-02 
07:01:04.280422582 +0000 UTC m=+874.401605723" watchObservedRunningTime="2025-10-02 07:01:04.284389555 +0000 UTC m=+874.405572686" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.951658 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fd57746bf-r6q9k"] Oct 02 07:01:04 crc kubenswrapper[4786]: E1002 07:01:04.952156 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce72c1de-6645-43e2-88bf-11f40a02d16c" containerName="keystone-db-sync" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.952168 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce72c1de-6645-43e2-88bf-11f40a02d16c" containerName="keystone-db-sync" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.952310 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce72c1de-6645-43e2-88bf-11f40a02d16c" containerName="keystone-db-sync" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.953040 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.963227 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd57746bf-r6q9k"] Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.969135 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-config\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.969187 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.969249 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.969265 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-svc\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.969326 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:04 crc kubenswrapper[4786]: I1002 07:01:04.969341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbwb7\" (UniqueName: \"kubernetes.io/projected/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-kube-api-access-nbwb7\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.002385 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-k5dzw"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.003315 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.005839 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.005898 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.006020 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.006132 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v5blg" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.016664 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k5dzw"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.070705 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.070743 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbwb7\" (UniqueName: \"kubernetes.io/projected/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-kube-api-access-nbwb7\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.070864 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-fernet-keys\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.070913 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-config\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.070944 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-combined-ca-bundle\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.070982 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-credential-keys\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-scripts\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071028 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmxlg\" (UniqueName: \"kubernetes.io/projected/82760f71-1d30-41b9-9c1e-bc34539e270f-kube-api-access-lmxlg\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071046 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071102 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-config-data\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071133 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-svc\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071399 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071543 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-config\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.071859 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-svc\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.072005 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: 
\"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.072094 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.088023 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbwb7\" (UniqueName: \"kubernetes.io/projected/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-kube-api-access-nbwb7\") pod \"dnsmasq-dns-6fd57746bf-r6q9k\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.115626 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.117255 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.119115 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.120233 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.128090 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.172899 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-log-httpd\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.172972 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rblwg\" (UniqueName: \"kubernetes.io/projected/46da85da-85a6-4ff7-8410-c26ccd99967e-kube-api-access-rblwg\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.172999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-fernet-keys\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173038 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173056 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-combined-ca-bundle\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173082 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-credential-keys\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173107 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-scripts\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173123 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmxlg\" (UniqueName: \"kubernetes.io/projected/82760f71-1d30-41b9-9c1e-bc34539e270f-kube-api-access-lmxlg\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173143 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-scripts\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: 
I1002 07:01:05.173165 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-run-httpd\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173198 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-config-data\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.173214 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-config-data\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.176632 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-scripts\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.178299 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-fernet-keys\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.178514 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-credential-keys\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.189972 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmxlg\" (UniqueName: \"kubernetes.io/projected/82760f71-1d30-41b9-9c1e-bc34539e270f-kube-api-access-lmxlg\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.191328 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-combined-ca-bundle\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.191334 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-config-data\") pod \"keystone-bootstrap-k5dzw\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.269059 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.275778 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rblwg\" (UniqueName: \"kubernetes.io/projected/46da85da-85a6-4ff7-8410-c26ccd99967e-kube-api-access-rblwg\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.275864 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.275953 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-scripts\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.275982 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-run-httpd\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.276001 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.276025 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-config-data\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.276070 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-log-httpd\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.276412 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-log-httpd\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.276597 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-run-httpd\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.280250 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd57746bf-r6q9k"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.282070 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.284170 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-scripts\") pod \"ceilometer-0\" 
(UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.291904 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.298260 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-config-data\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.307116 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rblwg\" (UniqueName: \"kubernetes.io/projected/46da85da-85a6-4ff7-8410-c26ccd99967e-kube-api-access-rblwg\") pod \"ceilometer-0\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.319003 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.337111 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56796c56b5-nx44j"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.338431 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.357885 4786 generic.go:334] "Generic (PLEG): container finished" podID="7e8a0cbf-1247-443e-b393-b4981c99f28f" containerID="639fcb46b84eea7c9c255304e2221e76af4340976ae3b17477a61eb3c132b293" exitCode=0 Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.358130 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4170-account-create-6l78c" event={"ID":"7e8a0cbf-1247-443e-b393-b4981c99f28f","Type":"ContainerDied","Data":"639fcb46b84eea7c9c255304e2221e76af4340976ae3b17477a61eb3c132b293"} Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.359807 4786 generic.go:334] "Generic (PLEG): container finished" podID="ed6d6948-364a-4b79-ba0b-7ea0645e36e4" containerID="21914d2214c99cf7e1da2e1f961f98d5d21eecf347eabccdcf5a1bcc230c2642" exitCode=0 Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.359849 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a22-account-create-qrrp9" event={"ID":"ed6d6948-364a-4b79-ba0b-7ea0645e36e4","Type":"ContainerDied","Data":"21914d2214c99cf7e1da2e1f961f98d5d21eecf347eabccdcf5a1bcc230c2642"} Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.361113 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56796c56b5-nx44j"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.361535 4786 generic.go:334] "Generic (PLEG): container finished" podID="468d7644-ca17-43aa-88b5-f4917044b91f" containerID="c6bb577cad8c79305fa456f676a35cb17dd34c66a74d65a8dd754b3e250f1df5" exitCode=0 Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.361632 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-149f-account-create-lxndm" event={"ID":"468d7644-ca17-43aa-88b5-f4917044b91f","Type":"ContainerDied","Data":"c6bb577cad8c79305fa456f676a35cb17dd34c66a74d65a8dd754b3e250f1df5"} Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 
07:01:05.390041 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zt9q9"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.393987 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.397547 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zt9q9"] Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.400217 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.400354 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mqwps" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.400530 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.432997 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483318 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-nb\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483521 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-config\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483578 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-sb\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483598 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-scripts\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483680 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-svc\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " 
pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483709 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-combined-ca-bundle\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483723 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8ncn\" (UniqueName: \"kubernetes.io/projected/5cc1379e-f398-4974-a6c1-408d344ff49c-kube-api-access-h8ncn\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483783 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-config-data\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483835 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc1379e-f398-4974-a6c1-408d344ff49c-logs\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483849 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmqfl\" (UniqueName: \"kubernetes.io/projected/2d8adabb-6397-4678-b0de-87c2b7817f69-kube-api-access-rmqfl\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: 
\"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.483886 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-swift-storage-0\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585538 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-config\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585595 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-sb\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585619 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-scripts\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585680 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-svc\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " 
pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585710 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-combined-ca-bundle\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585727 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8ncn\" (UniqueName: \"kubernetes.io/projected/5cc1379e-f398-4974-a6c1-408d344ff49c-kube-api-access-h8ncn\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585777 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-config-data\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585814 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc1379e-f398-4974-a6c1-408d344ff49c-logs\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585831 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmqfl\" (UniqueName: \"kubernetes.io/projected/2d8adabb-6397-4678-b0de-87c2b7817f69-kube-api-access-rmqfl\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: 
I1002 07:01:05.585864 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-swift-storage-0\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.585918 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-nb\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.586782 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-sb\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.587276 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-nb\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.587557 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-svc\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.587987 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-config\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.588491 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc1379e-f398-4974-a6c1-408d344ff49c-logs\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.588743 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-swift-storage-0\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.593641 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-combined-ca-bundle\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.594381 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-scripts\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.594833 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-config-data\") pod \"placement-db-sync-zt9q9\" (UID: 
\"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.602885 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmqfl\" (UniqueName: \"kubernetes.io/projected/2d8adabb-6397-4678-b0de-87c2b7817f69-kube-api-access-rmqfl\") pod \"dnsmasq-dns-56796c56b5-nx44j\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.602904 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8ncn\" (UniqueName: \"kubernetes.io/projected/5cc1379e-f398-4974-a6c1-408d344ff49c-kube-api-access-h8ncn\") pod \"placement-db-sync-zt9q9\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.675734 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.722049 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.768012 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-k5dzw"] Oct 02 07:01:05 crc kubenswrapper[4786]: W1002 07:01:05.769020 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82760f71_1d30_41b9_9c1e_bc34539e270f.slice/crio-2ecb3a42aa43f1c0fdf020e57ec44cc0195d25690dd5a581d6ade34c73480c6d WatchSource:0}: Error finding container 2ecb3a42aa43f1c0fdf020e57ec44cc0195d25690dd5a581d6ade34c73480c6d: Status 404 returned error can't find the container with id 2ecb3a42aa43f1c0fdf020e57ec44cc0195d25690dd5a581d6ade34c73480c6d Oct 02 07:01:05 crc kubenswrapper[4786]: I1002 07:01:05.862376 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd57746bf-r6q9k"] Oct 02 07:01:05 crc kubenswrapper[4786]: W1002 07:01:05.867849 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0cb4956_cd07_4be1_94b9_0ba13aba70c7.slice/crio-79f578606a8b91c9e326d0e9175748393cf6a16cf3234e69a3e460f218255beb WatchSource:0}: Error finding container 79f578606a8b91c9e326d0e9175748393cf6a16cf3234e69a3e460f218255beb: Status 404 returned error can't find the container with id 79f578606a8b91c9e326d0e9175748393cf6a16cf3234e69a3e460f218255beb Oct 02 07:01:06 crc kubenswrapper[4786]: W1002 07:01:05.916629 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46da85da_85a6_4ff7_8410_c26ccd99967e.slice/crio-3a505dc4e7b8923b2818287c12249880107aa4b6ab2fe60695f0ae93dda4742a WatchSource:0}: Error finding container 3a505dc4e7b8923b2818287c12249880107aa4b6ab2fe60695f0ae93dda4742a: Status 404 returned error can't find the container with id 
3a505dc4e7b8923b2818287c12249880107aa4b6ab2fe60695f0ae93dda4742a Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:05.916979 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.057151 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56796c56b5-nx44j"] Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.122506 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zt9q9"] Oct 02 07:01:06 crc kubenswrapper[4786]: W1002 07:01:06.132489 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d8adabb_6397_4678_b0de_87c2b7817f69.slice/crio-384a1b7dca4f1e91e3fe5c6b58781cf1bc2a0ffe4a5a90f8107b0c13d11f581e WatchSource:0}: Error finding container 384a1b7dca4f1e91e3fe5c6b58781cf1bc2a0ffe4a5a90f8107b0c13d11f581e: Status 404 returned error can't find the container with id 384a1b7dca4f1e91e3fe5c6b58781cf1bc2a0ffe4a5a90f8107b0c13d11f581e Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.368846 4786 generic.go:334] "Generic (PLEG): container finished" podID="2d8adabb-6397-4678-b0de-87c2b7817f69" containerID="ec0f1427a1838545bd2be50514677cc80317ad61e25ffb993491c15e7b892d4c" exitCode=0 Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.368960 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" event={"ID":"2d8adabb-6397-4678-b0de-87c2b7817f69","Type":"ContainerDied","Data":"ec0f1427a1838545bd2be50514677cc80317ad61e25ffb993491c15e7b892d4c"} Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.369008 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" event={"ID":"2d8adabb-6397-4678-b0de-87c2b7817f69","Type":"ContainerStarted","Data":"384a1b7dca4f1e91e3fe5c6b58781cf1bc2a0ffe4a5a90f8107b0c13d11f581e"} Oct 02 07:01:06 crc 
kubenswrapper[4786]: I1002 07:01:06.370854 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k5dzw" event={"ID":"82760f71-1d30-41b9-9c1e-bc34539e270f","Type":"ContainerStarted","Data":"fdbd48719a6a31bfd0cbbfe5cb723475e0d8c36f5c3df1d981ed4ce4e9def17f"} Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.370881 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k5dzw" event={"ID":"82760f71-1d30-41b9-9c1e-bc34539e270f","Type":"ContainerStarted","Data":"2ecb3a42aa43f1c0fdf020e57ec44cc0195d25690dd5a581d6ade34c73480c6d"} Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.372093 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zt9q9" event={"ID":"5cc1379e-f398-4974-a6c1-408d344ff49c","Type":"ContainerStarted","Data":"698b1b99601477959d4e0871399c9f34e364017b49a51fee2958545a571cce25"} Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.374145 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46da85da-85a6-4ff7-8410-c26ccd99967e","Type":"ContainerStarted","Data":"3a505dc4e7b8923b2818287c12249880107aa4b6ab2fe60695f0ae93dda4742a"} Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.375189 4786 generic.go:334] "Generic (PLEG): container finished" podID="e0cb4956-cd07-4be1-94b9-0ba13aba70c7" containerID="b16cbd5d0d7207dd5422a3f10407111ca12ff65132206435d32076e2ab99f511" exitCode=0 Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.375416 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" event={"ID":"e0cb4956-cd07-4be1-94b9-0ba13aba70c7","Type":"ContainerDied","Data":"b16cbd5d0d7207dd5422a3f10407111ca12ff65132206435d32076e2ab99f511"} Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.375438 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" 
event={"ID":"e0cb4956-cd07-4be1-94b9-0ba13aba70c7","Type":"ContainerStarted","Data":"79f578606a8b91c9e326d0e9175748393cf6a16cf3234e69a3e460f218255beb"} Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.426090 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-k5dzw" podStartSLOduration=2.426075195 podStartE2EDuration="2.426075195s" podCreationTimestamp="2025-10-02 07:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:06.425282181 +0000 UTC m=+876.546465322" watchObservedRunningTime="2025-10-02 07:01:06.426075195 +0000 UTC m=+876.547258325" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.660681 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a22-account-create-qrrp9" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.796513 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-149f-account-create-lxndm" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.812602 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p9fd\" (UniqueName: \"kubernetes.io/projected/468d7644-ca17-43aa-88b5-f4917044b91f-kube-api-access-2p9fd\") pod \"468d7644-ca17-43aa-88b5-f4917044b91f\" (UID: \"468d7644-ca17-43aa-88b5-f4917044b91f\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.812804 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js58z\" (UniqueName: \"kubernetes.io/projected/ed6d6948-364a-4b79-ba0b-7ea0645e36e4-kube-api-access-js58z\") pod \"ed6d6948-364a-4b79-ba0b-7ea0645e36e4\" (UID: \"ed6d6948-364a-4b79-ba0b-7ea0645e36e4\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.825299 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468d7644-ca17-43aa-88b5-f4917044b91f-kube-api-access-2p9fd" (OuterVolumeSpecName: "kube-api-access-2p9fd") pod "468d7644-ca17-43aa-88b5-f4917044b91f" (UID: "468d7644-ca17-43aa-88b5-f4917044b91f"). InnerVolumeSpecName "kube-api-access-2p9fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.825803 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6d6948-364a-4b79-ba0b-7ea0645e36e4-kube-api-access-js58z" (OuterVolumeSpecName: "kube-api-access-js58z") pod "ed6d6948-364a-4b79-ba0b-7ea0645e36e4" (UID: "ed6d6948-364a-4b79-ba0b-7ea0645e36e4"). InnerVolumeSpecName "kube-api-access-js58z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.836441 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4170-account-create-6l78c" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.842088 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914127 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-swift-storage-0\") pod \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914182 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbwb7\" (UniqueName: \"kubernetes.io/projected/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-kube-api-access-nbwb7\") pod \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914270 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-nb\") pod \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914310 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dp69\" (UniqueName: \"kubernetes.io/projected/7e8a0cbf-1247-443e-b393-b4981c99f28f-kube-api-access-9dp69\") pod \"7e8a0cbf-1247-443e-b393-b4981c99f28f\" (UID: \"7e8a0cbf-1247-443e-b393-b4981c99f28f\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914347 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-svc\") pod 
\"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914379 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-sb\") pod \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914415 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-config\") pod \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\" (UID: \"e0cb4956-cd07-4be1-94b9-0ba13aba70c7\") " Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914750 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js58z\" (UniqueName: \"kubernetes.io/projected/ed6d6948-364a-4b79-ba0b-7ea0645e36e4-kube-api-access-js58z\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.914766 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p9fd\" (UniqueName: \"kubernetes.io/projected/468d7644-ca17-43aa-88b5-f4917044b91f-kube-api-access-2p9fd\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.926714 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-kube-api-access-nbwb7" (OuterVolumeSpecName: "kube-api-access-nbwb7") pod "e0cb4956-cd07-4be1-94b9-0ba13aba70c7" (UID: "e0cb4956-cd07-4be1-94b9-0ba13aba70c7"). InnerVolumeSpecName "kube-api-access-nbwb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.926756 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8a0cbf-1247-443e-b393-b4981c99f28f-kube-api-access-9dp69" (OuterVolumeSpecName: "kube-api-access-9dp69") pod "7e8a0cbf-1247-443e-b393-b4981c99f28f" (UID: "7e8a0cbf-1247-443e-b393-b4981c99f28f"). InnerVolumeSpecName "kube-api-access-9dp69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.940668 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.942027 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-config" (OuterVolumeSpecName: "config") pod "e0cb4956-cd07-4be1-94b9-0ba13aba70c7" (UID: "e0cb4956-cd07-4be1-94b9-0ba13aba70c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.942075 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0cb4956-cd07-4be1-94b9-0ba13aba70c7" (UID: "e0cb4956-cd07-4be1-94b9-0ba13aba70c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.946470 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0cb4956-cd07-4be1-94b9-0ba13aba70c7" (UID: "e0cb4956-cd07-4be1-94b9-0ba13aba70c7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.948013 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0cb4956-cd07-4be1-94b9-0ba13aba70c7" (UID: "e0cb4956-cd07-4be1-94b9-0ba13aba70c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:06 crc kubenswrapper[4786]: I1002 07:01:06.957168 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e0cb4956-cd07-4be1-94b9-0ba13aba70c7" (UID: "e0cb4956-cd07-4be1-94b9-0ba13aba70c7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.016798 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.016831 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbwb7\" (UniqueName: \"kubernetes.io/projected/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-kube-api-access-nbwb7\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.016844 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.016854 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dp69\" (UniqueName: 
\"kubernetes.io/projected/7e8a0cbf-1247-443e-b393-b4981c99f28f-kube-api-access-9dp69\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.016864 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.016871 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.016879 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb4956-cd07-4be1-94b9-0ba13aba70c7-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.384027 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" event={"ID":"2d8adabb-6397-4678-b0de-87c2b7817f69","Type":"ContainerStarted","Data":"35ed671b0da42a9d6d556c7a69ca8e57d63886cd200c2a0d87c00fd375fa7695"} Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.384107 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.385774 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4170-account-create-6l78c" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.385765 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4170-account-create-6l78c" event={"ID":"7e8a0cbf-1247-443e-b393-b4981c99f28f","Type":"ContainerDied","Data":"fb8132ffae98ba016e22771b7be09c03d1d244c4e8c3dfd11c18baee16287dbd"} Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.385804 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb8132ffae98ba016e22771b7be09c03d1d244c4e8c3dfd11c18baee16287dbd" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.387780 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a22-account-create-qrrp9" event={"ID":"ed6d6948-364a-4b79-ba0b-7ea0645e36e4","Type":"ContainerDied","Data":"db427aba8384cf7099be6f9fdb722904439b8aab8dc7555fe019c77b46fd1757"} Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.387802 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db427aba8384cf7099be6f9fdb722904439b8aab8dc7555fe019c77b46fd1757" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.387837 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a22-account-create-qrrp9" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.399485 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-149f-account-create-lxndm" event={"ID":"468d7644-ca17-43aa-88b5-f4917044b91f","Type":"ContainerDied","Data":"3fadb1010c409b4edf2397a691749c9597bbb35a03731f26c0d43d96d06509a4"} Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.399507 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fadb1010c409b4edf2397a691749c9597bbb35a03731f26c0d43d96d06509a4" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.399536 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-149f-account-create-lxndm" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.403864 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.405503 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd57746bf-r6q9k" event={"ID":"e0cb4956-cd07-4be1-94b9-0ba13aba70c7","Type":"ContainerDied","Data":"79f578606a8b91c9e326d0e9175748393cf6a16cf3234e69a3e460f218255beb"} Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.405537 4786 scope.go:117] "RemoveContainer" containerID="b16cbd5d0d7207dd5422a3f10407111ca12ff65132206435d32076e2ab99f511" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.436186 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" podStartSLOduration=2.436171231 podStartE2EDuration="2.436171231s" podCreationTimestamp="2025-10-02 07:01:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:07.402432918 +0000 UTC m=+877.523616059" watchObservedRunningTime="2025-10-02 07:01:07.436171231 +0000 UTC m=+877.557354363" Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.449057 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fd57746bf-r6q9k"] Oct 02 07:01:07 crc kubenswrapper[4786]: I1002 07:01:07.452883 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fd57746bf-r6q9k"] Oct 02 07:01:08 crc kubenswrapper[4786]: I1002 07:01:08.189766 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0cb4956-cd07-4be1-94b9-0ba13aba70c7" path="/var/lib/kubelet/pods/e0cb4956-cd07-4be1-94b9-0ba13aba70c7/volumes" Oct 02 07:01:08 crc kubenswrapper[4786]: I1002 07:01:08.413592 4786 generic.go:334] "Generic 
(PLEG): container finished" podID="82760f71-1d30-41b9-9c1e-bc34539e270f" containerID="fdbd48719a6a31bfd0cbbfe5cb723475e0d8c36f5c3df1d981ed4ce4e9def17f" exitCode=0 Oct 02 07:01:08 crc kubenswrapper[4786]: I1002 07:01:08.413675 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k5dzw" event={"ID":"82760f71-1d30-41b9-9c1e-bc34539e270f","Type":"ContainerDied","Data":"fdbd48719a6a31bfd0cbbfe5cb723475e0d8c36f5c3df1d981ed4ce4e9def17f"} Oct 02 07:01:09 crc kubenswrapper[4786]: I1002 07:01:09.426462 4786 generic.go:334] "Generic (PLEG): container finished" podID="205062cf-def7-4b01-b1cf-7f2e1d0ef398" containerID="995abe9d11b6677d42d75ca72fd216368afdf472ab80eca4fcdbd5575d43730d" exitCode=0 Oct 02 07:01:09 crc kubenswrapper[4786]: I1002 07:01:09.426551 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kbkxq" event={"ID":"205062cf-def7-4b01-b1cf-7f2e1d0ef398","Type":"ContainerDied","Data":"995abe9d11b6677d42d75ca72fd216368afdf472ab80eca4fcdbd5575d43730d"} Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.739786 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.745786 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kbkxq" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874136 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-combined-ca-bundle\") pod \"82760f71-1d30-41b9-9c1e-bc34539e270f\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874343 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-credential-keys\") pod \"82760f71-1d30-41b9-9c1e-bc34539e270f\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-db-sync-config-data\") pod \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874394 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-combined-ca-bundle\") pod \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874426 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-config-data\") pod \"82760f71-1d30-41b9-9c1e-bc34539e270f\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874458 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lmxlg\" (UniqueName: \"kubernetes.io/projected/82760f71-1d30-41b9-9c1e-bc34539e270f-kube-api-access-lmxlg\") pod \"82760f71-1d30-41b9-9c1e-bc34539e270f\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874478 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-config-data\") pod \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874514 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlxrz\" (UniqueName: \"kubernetes.io/projected/205062cf-def7-4b01-b1cf-7f2e1d0ef398-kube-api-access-xlxrz\") pod \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\" (UID: \"205062cf-def7-4b01-b1cf-7f2e1d0ef398\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874559 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-scripts\") pod \"82760f71-1d30-41b9-9c1e-bc34539e270f\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.874593 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-fernet-keys\") pod \"82760f71-1d30-41b9-9c1e-bc34539e270f\" (UID: \"82760f71-1d30-41b9-9c1e-bc34539e270f\") " Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.878765 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "82760f71-1d30-41b9-9c1e-bc34539e270f" (UID: "82760f71-1d30-41b9-9c1e-bc34539e270f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.878790 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82760f71-1d30-41b9-9c1e-bc34539e270f-kube-api-access-lmxlg" (OuterVolumeSpecName: "kube-api-access-lmxlg") pod "82760f71-1d30-41b9-9c1e-bc34539e270f" (UID: "82760f71-1d30-41b9-9c1e-bc34539e270f"). InnerVolumeSpecName "kube-api-access-lmxlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.881426 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "82760f71-1d30-41b9-9c1e-bc34539e270f" (UID: "82760f71-1d30-41b9-9c1e-bc34539e270f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.881767 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "205062cf-def7-4b01-b1cf-7f2e1d0ef398" (UID: "205062cf-def7-4b01-b1cf-7f2e1d0ef398"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.884103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/205062cf-def7-4b01-b1cf-7f2e1d0ef398-kube-api-access-xlxrz" (OuterVolumeSpecName: "kube-api-access-xlxrz") pod "205062cf-def7-4b01-b1cf-7f2e1d0ef398" (UID: "205062cf-def7-4b01-b1cf-7f2e1d0ef398"). InnerVolumeSpecName "kube-api-access-xlxrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.889142 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-scripts" (OuterVolumeSpecName: "scripts") pod "82760f71-1d30-41b9-9c1e-bc34539e270f" (UID: "82760f71-1d30-41b9-9c1e-bc34539e270f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.904554 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "205062cf-def7-4b01-b1cf-7f2e1d0ef398" (UID: "205062cf-def7-4b01-b1cf-7f2e1d0ef398"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.907058 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82760f71-1d30-41b9-9c1e-bc34539e270f" (UID: "82760f71-1d30-41b9-9c1e-bc34539e270f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.913251 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-config-data" (OuterVolumeSpecName: "config-data") pod "82760f71-1d30-41b9-9c1e-bc34539e270f" (UID: "82760f71-1d30-41b9-9c1e-bc34539e270f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.922342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-config-data" (OuterVolumeSpecName: "config-data") pod "205062cf-def7-4b01-b1cf-7f2e1d0ef398" (UID: "205062cf-def7-4b01-b1cf-7f2e1d0ef398"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976479 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmxlg\" (UniqueName: \"kubernetes.io/projected/82760f71-1d30-41b9-9c1e-bc34539e270f-kube-api-access-lmxlg\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976508 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976518 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlxrz\" (UniqueName: \"kubernetes.io/projected/205062cf-def7-4b01-b1cf-7f2e1d0ef398-kube-api-access-xlxrz\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976527 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976535 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976543 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976550 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976558 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976565 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/205062cf-def7-4b01-b1cf-7f2e1d0ef398-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:10 crc kubenswrapper[4786]: I1002 07:01:10.976572 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82760f71-1d30-41b9-9c1e-bc34539e270f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.442772 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kbkxq" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.442767 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kbkxq" event={"ID":"205062cf-def7-4b01-b1cf-7f2e1d0ef398","Type":"ContainerDied","Data":"994ecdb9ee8fdb4e174a5736cc23f1a32a3003f91ed44e4127107c70d8f3d010"} Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.442812 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="994ecdb9ee8fdb4e174a5736cc23f1a32a3003f91ed44e4127107c70d8f3d010" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.445845 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-k5dzw" event={"ID":"82760f71-1d30-41b9-9c1e-bc34539e270f","Type":"ContainerDied","Data":"2ecb3a42aa43f1c0fdf020e57ec44cc0195d25690dd5a581d6ade34c73480c6d"} Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.445882 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ecb3a42aa43f1c0fdf020e57ec44cc0195d25690dd5a581d6ade34c73480c6d" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.445941 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-k5dzw" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.685928 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56796c56b5-nx44j"] Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.686283 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" podUID="2d8adabb-6397-4678-b0de-87c2b7817f69" containerName="dnsmasq-dns" containerID="cri-o://35ed671b0da42a9d6d556c7a69ca8e57d63886cd200c2a0d87c00fd375fa7695" gracePeriod=10 Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.693913 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710305 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5f8875c-7pgzk"] Oct 02 07:01:11 crc kubenswrapper[4786]: E1002 07:01:11.710598 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="205062cf-def7-4b01-b1cf-7f2e1d0ef398" containerName="glance-db-sync" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710616 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="205062cf-def7-4b01-b1cf-7f2e1d0ef398" containerName="glance-db-sync" Oct 02 07:01:11 crc kubenswrapper[4786]: E1002 07:01:11.710633 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6d6948-364a-4b79-ba0b-7ea0645e36e4" containerName="mariadb-account-create" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710640 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6d6948-364a-4b79-ba0b-7ea0645e36e4" containerName="mariadb-account-create" Oct 02 07:01:11 crc kubenswrapper[4786]: E1002 07:01:11.710658 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82760f71-1d30-41b9-9c1e-bc34539e270f" containerName="keystone-bootstrap" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 
07:01:11.710665 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="82760f71-1d30-41b9-9c1e-bc34539e270f" containerName="keystone-bootstrap" Oct 02 07:01:11 crc kubenswrapper[4786]: E1002 07:01:11.710679 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e8a0cbf-1247-443e-b393-b4981c99f28f" containerName="mariadb-account-create" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710685 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8a0cbf-1247-443e-b393-b4981c99f28f" containerName="mariadb-account-create" Oct 02 07:01:11 crc kubenswrapper[4786]: E1002 07:01:11.710728 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468d7644-ca17-43aa-88b5-f4917044b91f" containerName="mariadb-account-create" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710735 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="468d7644-ca17-43aa-88b5-f4917044b91f" containerName="mariadb-account-create" Oct 02 07:01:11 crc kubenswrapper[4786]: E1002 07:01:11.710746 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0cb4956-cd07-4be1-94b9-0ba13aba70c7" containerName="init" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710751 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0cb4956-cd07-4be1-94b9-0ba13aba70c7" containerName="init" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710909 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="205062cf-def7-4b01-b1cf-7f2e1d0ef398" containerName="glance-db-sync" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710922 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6d6948-364a-4b79-ba0b-7ea0645e36e4" containerName="mariadb-account-create" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710930 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e8a0cbf-1247-443e-b393-b4981c99f28f" containerName="mariadb-account-create" Oct 02 07:01:11 crc 
kubenswrapper[4786]: I1002 07:01:11.710945 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0cb4956-cd07-4be1-94b9-0ba13aba70c7" containerName="init" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710954 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="468d7644-ca17-43aa-88b5-f4917044b91f" containerName="mariadb-account-create" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.710964 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="82760f71-1d30-41b9-9c1e-bc34539e270f" containerName="keystone-bootstrap" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.711662 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.729458 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5f8875c-7pgzk"] Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.850807 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-k5dzw"] Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.856231 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-k5dzw"] Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.891193 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-svc\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.891233 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-config\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " 
pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.891308 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqwx7\" (UniqueName: \"kubernetes.io/projected/861e1d0e-bfec-4adf-8712-5ff000e0cf87-kube-api-access-hqwx7\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.891329 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.891361 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.891467 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.955382 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lbkdj"] Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.956364 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.958026 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.958250 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.958430 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v5blg" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.959817 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.962312 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lbkdj"] Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.992790 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.992862 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-svc\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.992886 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-config\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " 
pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.992963 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqwx7\" (UniqueName: \"kubernetes.io/projected/861e1d0e-bfec-4adf-8712-5ff000e0cf87-kube-api-access-hqwx7\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.992982 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.993014 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.993719 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.993767 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-svc\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc 
kubenswrapper[4786]: I1002 07:01:11.993934 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.993943 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:11 crc kubenswrapper[4786]: I1002 07:01:11.994477 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-config\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.009932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqwx7\" (UniqueName: \"kubernetes.io/projected/861e1d0e-bfec-4adf-8712-5ff000e0cf87-kube-api-access-hqwx7\") pod \"dnsmasq-dns-5c5f8875c-7pgzk\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.030469 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.052345 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xt8ms"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.053320 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.056285 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-s2wrb" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.056312 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.059229 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xt8ms"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.096516 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-config-data\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.096597 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-combined-ca-bundle\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.096833 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-fernet-keys\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.096912 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-credential-keys\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.096940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-scripts\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.096977 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxf6z\" (UniqueName: \"kubernetes.io/projected/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-kube-api-access-kxf6z\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.191738 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82760f71-1d30-41b9-9c1e-bc34539e270f" path="/var/lib/kubelet/pods/82760f71-1d30-41b9-9c1e-bc34539e270f/volumes" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.198471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-config-data\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.198527 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-combined-ca-bundle\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " 
pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.198603 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-db-sync-config-data\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.198669 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-fernet-keys\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.199342 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-credential-keys\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.199376 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-combined-ca-bundle\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.199397 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-scripts\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc 
kubenswrapper[4786]: I1002 07:01:12.199444 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxf6z\" (UniqueName: \"kubernetes.io/projected/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-kube-api-access-kxf6z\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.199465 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rqpg\" (UniqueName: \"kubernetes.io/projected/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-kube-api-access-6rqpg\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.202270 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-fernet-keys\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.204277 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-config-data\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.204312 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-scripts\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.204908 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-combined-ca-bundle\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.205202 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-credential-keys\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.215003 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxf6z\" (UniqueName: \"kubernetes.io/projected/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-kube-api-access-kxf6z\") pod \"keystone-bootstrap-lbkdj\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.268779 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-f42fq"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.269891 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.271896 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.272067 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-84r2m" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.272204 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.273972 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.274426 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-g9gtn"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.275243 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.281547 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v4bzm" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.281741 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.281843 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.281961 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-f42fq"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.286630 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g9gtn"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.300920 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-combined-ca-bundle\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.301027 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rqpg\" (UniqueName: \"kubernetes.io/projected/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-kube-api-access-6rqpg\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc 
kubenswrapper[4786]: I1002 07:01:12.301268 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-db-sync-config-data\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.307200 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-db-sync-config-data\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.318896 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-combined-ca-bundle\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.328978 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rqpg\" (UniqueName: \"kubernetes.io/projected/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-kube-api-access-6rqpg\") pod \"barbican-db-sync-xt8ms\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.368664 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-combined-ca-bundle\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403117 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n7xf\" (UniqueName: \"kubernetes.io/projected/ca674644-5646-452d-ba2a-5ff2844f64ea-kube-api-access-5n7xf\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403138 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-etc-machine-id\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403327 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-db-sync-config-data\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403400 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mp25\" (UniqueName: \"kubernetes.io/projected/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-kube-api-access-8mp25\") pod \"cinder-db-sync-g9gtn\" (UID: 
\"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403437 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-config-data\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403467 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-config\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-combined-ca-bundle\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.403606 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-scripts\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.454188 4786 generic.go:334] "Generic (PLEG): container finished" podID="2d8adabb-6397-4678-b0de-87c2b7817f69" containerID="35ed671b0da42a9d6d556c7a69ca8e57d63886cd200c2a0d87c00fd375fa7695" exitCode=0 Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.454208 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-56796c56b5-nx44j" event={"ID":"2d8adabb-6397-4678-b0de-87c2b7817f69","Type":"ContainerDied","Data":"35ed671b0da42a9d6d556c7a69ca8e57d63886cd200c2a0d87c00fd375fa7695"} Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.504972 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-combined-ca-bundle\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505038 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-scripts\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505173 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-combined-ca-bundle\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505198 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n7xf\" (UniqueName: \"kubernetes.io/projected/ca674644-5646-452d-ba2a-5ff2844f64ea-kube-api-access-5n7xf\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505215 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-etc-machine-id\") pod \"cinder-db-sync-g9gtn\" (UID: 
\"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505293 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-db-sync-config-data\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505352 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mp25\" (UniqueName: \"kubernetes.io/projected/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-kube-api-access-8mp25\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505355 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-etc-machine-id\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505376 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-config-data\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.505573 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-config\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.510053 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-combined-ca-bundle\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.510077 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-db-sync-config-data\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.510572 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-scripts\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.510753 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-config\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.510865 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-combined-ca-bundle\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.510995 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-config-data\") pod 
\"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.520109 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n7xf\" (UniqueName: \"kubernetes.io/projected/ca674644-5646-452d-ba2a-5ff2844f64ea-kube-api-access-5n7xf\") pod \"neutron-db-sync-f42fq\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.520204 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mp25\" (UniqueName: \"kubernetes.io/projected/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-kube-api-access-8mp25\") pod \"cinder-db-sync-g9gtn\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.591372 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.597581 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.701316 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.702646 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.705096 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8rkqb" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.705286 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.706366 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.708495 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.779947 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.781132 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.782602 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.792118 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.809122 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-logs\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.809290 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.809332 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.809423 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " 
pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.809456 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.809492 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.809606 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jqpz\" (UniqueName: \"kubernetes.io/projected/c8addfa7-2397-45de-82ed-5932b165ca3b-kube-api-access-9jqpz\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.925915 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz7kj\" (UniqueName: \"kubernetes.io/projected/bb4539ca-be53-44b5-acce-23af3a79326a-kube-api-access-bz7kj\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926048 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-logs\") pod \"glance-default-external-api-0\" (UID: 
\"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926175 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926220 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926262 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926292 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " 
pod="openstack/glance-default-internal-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926638 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926679 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.926725 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-logs\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.927175 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.931819 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:12 crc 
kubenswrapper[4786]: I1002 07:01:12.931912 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.931945 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.931994 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.932070 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.932136 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jqpz\" (UniqueName: \"kubernetes.io/projected/c8addfa7-2397-45de-82ed-5932b165ca3b-kube-api-access-9jqpz\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 
07:01:12.952220 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-config-data\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.958595 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.974557 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jqpz\" (UniqueName: \"kubernetes.io/projected/c8addfa7-2397-45de-82ed-5932b165ca3b-kube-api-access-9jqpz\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:12 crc kubenswrapper[4786]: I1002 07:01:12.979140 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-scripts\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.007270 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.033699 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.033847 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.033935 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.034024 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.034135 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.034220 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz7kj\" (UniqueName: 
\"kubernetes.io/projected/bb4539ca-be53-44b5-acce-23af3a79326a-kube-api-access-bz7kj\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.034318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.034376 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.034461 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-logs\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.034874 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.039244 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.039604 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.042852 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.050090 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz7kj\" (UniqueName: \"kubernetes.io/projected/bb4539ca-be53-44b5-acce-23af3a79326a-kube-api-access-bz7kj\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.056807 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.097355 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.320190 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.429698 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.489122 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" event={"ID":"2d8adabb-6397-4678-b0de-87c2b7817f69","Type":"ContainerDied","Data":"384a1b7dca4f1e91e3fe5c6b58781cf1bc2a0ffe4a5a90f8107b0c13d11f581e"} Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.489373 4786 scope.go:117] "RemoveContainer" containerID="35ed671b0da42a9d6d556c7a69ca8e57d63886cd200c2a0d87c00fd375fa7695" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.489518 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56796c56b5-nx44j" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.533784 4786 scope.go:117] "RemoveContainer" containerID="ec0f1427a1838545bd2be50514677cc80317ad61e25ffb993491c15e7b892d4c" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.545520 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmqfl\" (UniqueName: \"kubernetes.io/projected/2d8adabb-6397-4678-b0de-87c2b7817f69-kube-api-access-rmqfl\") pod \"2d8adabb-6397-4678-b0de-87c2b7817f69\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.545602 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-svc\") pod \"2d8adabb-6397-4678-b0de-87c2b7817f69\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.545620 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-sb\") pod \"2d8adabb-6397-4678-b0de-87c2b7817f69\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.545646 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-config\") pod \"2d8adabb-6397-4678-b0de-87c2b7817f69\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.545682 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-swift-storage-0\") pod \"2d8adabb-6397-4678-b0de-87c2b7817f69\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.545726 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-nb\") pod \"2d8adabb-6397-4678-b0de-87c2b7817f69\" (UID: \"2d8adabb-6397-4678-b0de-87c2b7817f69\") " Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.549436 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8adabb-6397-4678-b0de-87c2b7817f69-kube-api-access-rmqfl" (OuterVolumeSpecName: "kube-api-access-rmqfl") pod "2d8adabb-6397-4678-b0de-87c2b7817f69" (UID: "2d8adabb-6397-4678-b0de-87c2b7817f69"). InnerVolumeSpecName "kube-api-access-rmqfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.582739 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-config" (OuterVolumeSpecName: "config") pod "2d8adabb-6397-4678-b0de-87c2b7817f69" (UID: "2d8adabb-6397-4678-b0de-87c2b7817f69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.582859 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d8adabb-6397-4678-b0de-87c2b7817f69" (UID: "2d8adabb-6397-4678-b0de-87c2b7817f69"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.585245 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2d8adabb-6397-4678-b0de-87c2b7817f69" (UID: "2d8adabb-6397-4678-b0de-87c2b7817f69"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.588038 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d8adabb-6397-4678-b0de-87c2b7817f69" (UID: "2d8adabb-6397-4678-b0de-87c2b7817f69"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.589510 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d8adabb-6397-4678-b0de-87c2b7817f69" (UID: "2d8adabb-6397-4678-b0de-87c2b7817f69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.628865 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-f42fq"] Oct 02 07:01:13 crc kubenswrapper[4786]: W1002 07:01:13.635621 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca674644_5646_452d_ba2a_5ff2844f64ea.slice/crio-0904eb1b933d3d8996f538e62e7134a30cfcaf2f76d4010a3246e6fb48258819 WatchSource:0}: Error finding container 0904eb1b933d3d8996f538e62e7134a30cfcaf2f76d4010a3246e6fb48258819: Status 404 returned error can't find the container with id 0904eb1b933d3d8996f538e62e7134a30cfcaf2f76d4010a3246e6fb48258819 Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.657277 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.657300 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.657309 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:13 crc 
kubenswrapper[4786]: I1002 07:01:13.657317 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.657326 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d8adabb-6397-4678-b0de-87c2b7817f69-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.657333 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmqfl\" (UniqueName: \"kubernetes.io/projected/2d8adabb-6397-4678-b0de-87c2b7817f69-kube-api-access-rmqfl\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.743334 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xt8ms"] Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.754199 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5f8875c-7pgzk"] Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.813487 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.820705 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56796c56b5-nx44j"] Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.825858 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56796c56b5-nx44j"] Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.861167 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g9gtn"] Oct 02 07:01:13 crc kubenswrapper[4786]: I1002 07:01:13.866569 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lbkdj"] Oct 02 07:01:13 crc kubenswrapper[4786]: W1002 
07:01:13.874676 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod609b41c2_45b6_4b81_abe6_2ac3fe5ffdcd.slice/crio-ff064ccd007db3ebad58f98aea233a0b117745be4f5fbfa037f2c44c10268a4c WatchSource:0}: Error finding container ff064ccd007db3ebad58f98aea233a0b117745be4f5fbfa037f2c44c10268a4c: Status 404 returned error can't find the container with id ff064ccd007db3ebad58f98aea233a0b117745be4f5fbfa037f2c44c10268a4c Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.007151 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:14 crc kubenswrapper[4786]: W1002 07:01:14.014707 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8addfa7_2397_45de_82ed_5932b165ca3b.slice/crio-a705b85dce79b60540ac965384f3c6646f500ac06e64828b5ff8d253419180bd WatchSource:0}: Error finding container a705b85dce79b60540ac965384f3c6646f500ac06e64828b5ff8d253419180bd: Status 404 returned error can't find the container with id a705b85dce79b60540ac965384f3c6646f500ac06e64828b5ff8d253419180bd Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.189229 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8adabb-6397-4678-b0de-87c2b7817f69" path="/var/lib/kubelet/pods/2d8adabb-6397-4678-b0de-87c2b7817f69/volumes" Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.496519 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46da85da-85a6-4ff7-8410-c26ccd99967e","Type":"ContainerStarted","Data":"acea46ce74fcd747001e70e1a189e72bcbd4c85e213c49055b801c8197f289f7"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.497914 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g9gtn" 
event={"ID":"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd","Type":"ContainerStarted","Data":"ff064ccd007db3ebad58f98aea233a0b117745be4f5fbfa037f2c44c10268a4c"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.499807 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lbkdj" event={"ID":"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46","Type":"ContainerStarted","Data":"183be18201f0cbf05314f48a2d7db651df44eef2a3a372b53ef59fe0cb41c8ff"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.499828 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lbkdj" event={"ID":"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46","Type":"ContainerStarted","Data":"8949edb9879b527e6e9d99488e52eb00ff63895692feb5803e2b24296d66e9c6"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.502627 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zt9q9" event={"ID":"5cc1379e-f398-4974-a6c1-408d344ff49c","Type":"ContainerStarted","Data":"32a54afef2916bd13e35f5eb42c0de64c9d3e3df5caa85e9f12197c61260ed1f"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.504709 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8addfa7-2397-45de-82ed-5932b165ca3b","Type":"ContainerStarted","Data":"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.504733 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8addfa7-2397-45de-82ed-5932b165ca3b","Type":"ContainerStarted","Data":"a705b85dce79b60540ac965384f3c6646f500ac06e64828b5ff8d253419180bd"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.507314 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f42fq" 
event={"ID":"ca674644-5646-452d-ba2a-5ff2844f64ea","Type":"ContainerStarted","Data":"2180bad403aa120ca73eb4c7fc08aaad33410b1a1b15857057521c74dbd589cc"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.507338 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f42fq" event={"ID":"ca674644-5646-452d-ba2a-5ff2844f64ea","Type":"ContainerStarted","Data":"0904eb1b933d3d8996f538e62e7134a30cfcaf2f76d4010a3246e6fb48258819"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.508715 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xt8ms" event={"ID":"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6","Type":"ContainerStarted","Data":"48806de6aa5dd587c633e238f0f7406b536afb19e9903c108522003e76df4a6d"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.510817 4786 generic.go:334] "Generic (PLEG): container finished" podID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" containerID="c74f50a4a3e6f120af84bf422ef80846867a5a412b781f302fa1692f0b9f2d38" exitCode=0 Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.510859 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" event={"ID":"861e1d0e-bfec-4adf-8712-5ff000e0cf87","Type":"ContainerDied","Data":"c74f50a4a3e6f120af84bf422ef80846867a5a412b781f302fa1692f0b9f2d38"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.510875 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" event={"ID":"861e1d0e-bfec-4adf-8712-5ff000e0cf87","Type":"ContainerStarted","Data":"46804c0869411976fb54fa0a3de4024c50ddd542aaf0258ff66f4bc6463a9b6d"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.519671 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lbkdj" podStartSLOduration=3.519571768 podStartE2EDuration="3.519571768s" podCreationTimestamp="2025-10-02 07:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:14.515911244 +0000 UTC m=+884.637094385" watchObservedRunningTime="2025-10-02 07:01:14.519571768 +0000 UTC m=+884.640754900" Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.523850 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4539ca-be53-44b5-acce-23af3a79326a","Type":"ContainerStarted","Data":"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.523886 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4539ca-be53-44b5-acce-23af3a79326a","Type":"ContainerStarted","Data":"a89a7474884ec3a85339a4160f8c8d7225967d5a602e00e5d21bceed96b77dc1"} Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.531804 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zt9q9" podStartSLOduration=2.458362676 podStartE2EDuration="9.531791608s" podCreationTimestamp="2025-10-02 07:01:05 +0000 UTC" firstStartedPulling="2025-10-02 07:01:06.135005462 +0000 UTC m=+876.256188594" lastFinishedPulling="2025-10-02 07:01:13.208434395 +0000 UTC m=+883.329617526" observedRunningTime="2025-10-02 07:01:14.528023721 +0000 UTC m=+884.649206862" watchObservedRunningTime="2025-10-02 07:01:14.531791608 +0000 UTC m=+884.652974739" Oct 02 07:01:14 crc kubenswrapper[4786]: I1002 07:01:14.554274 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-f42fq" podStartSLOduration=2.554262184 podStartE2EDuration="2.554262184s" podCreationTimestamp="2025-10-02 07:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:14.553635792 +0000 UTC m=+884.674818943" watchObservedRunningTime="2025-10-02 07:01:14.554262184 
+0000 UTC m=+884.675445315" Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.094029 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.177274 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.536290 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8addfa7-2397-45de-82ed-5932b165ca3b","Type":"ContainerStarted","Data":"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c"} Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.538620 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" event={"ID":"861e1d0e-bfec-4adf-8712-5ff000e0cf87","Type":"ContainerStarted","Data":"4a3d0243946f1570af7ae9bcc8bee9f2a50855ed511e60eccca286ce8b6668e1"} Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.538725 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.540558 4786 generic.go:334] "Generic (PLEG): container finished" podID="5cc1379e-f398-4974-a6c1-408d344ff49c" containerID="32a54afef2916bd13e35f5eb42c0de64c9d3e3df5caa85e9f12197c61260ed1f" exitCode=0 Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.540613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zt9q9" event={"ID":"5cc1379e-f398-4974-a6c1-408d344ff49c","Type":"ContainerDied","Data":"32a54afef2916bd13e35f5eb42c0de64c9d3e3df5caa85e9f12197c61260ed1f"} Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.542730 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bb4539ca-be53-44b5-acce-23af3a79326a","Type":"ContainerStarted","Data":"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355"} Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.557429 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.557400941 podStartE2EDuration="4.557400941s" podCreationTimestamp="2025-10-02 07:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:15.549984633 +0000 UTC m=+885.671167774" watchObservedRunningTime="2025-10-02 07:01:15.557400941 +0000 UTC m=+885.678584072" Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.584009 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.583995887 podStartE2EDuration="4.583995887s" podCreationTimestamp="2025-10-02 07:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:15.570888793 +0000 UTC m=+885.692071924" watchObservedRunningTime="2025-10-02 07:01:15.583995887 +0000 UTC m=+885.705179018" Oct 02 07:01:15 crc kubenswrapper[4786]: I1002 07:01:15.600050 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" podStartSLOduration=4.600038457 podStartE2EDuration="4.600038457s" podCreationTimestamp="2025-10-02 07:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:15.597862373 +0000 UTC m=+885.719045524" watchObservedRunningTime="2025-10-02 07:01:15.600038457 +0000 UTC m=+885.721221588" Oct 02 07:01:16 crc kubenswrapper[4786]: I1002 07:01:16.554311 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"46da85da-85a6-4ff7-8410-c26ccd99967e","Type":"ContainerStarted","Data":"313fb5372e8a5c77898d9f6cf7e8584a9bbf717ffb6c3da2aa1f19f15298ac97"} Oct 02 07:01:16 crc kubenswrapper[4786]: I1002 07:01:16.556992 4786 generic.go:334] "Generic (PLEG): container finished" podID="fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" containerID="183be18201f0cbf05314f48a2d7db651df44eef2a3a372b53ef59fe0cb41c8ff" exitCode=0 Oct 02 07:01:16 crc kubenswrapper[4786]: I1002 07:01:16.557647 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lbkdj" event={"ID":"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46","Type":"ContainerDied","Data":"183be18201f0cbf05314f48a2d7db651df44eef2a3a372b53ef59fe0cb41c8ff"} Oct 02 07:01:16 crc kubenswrapper[4786]: I1002 07:01:16.558060 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerName="glance-log" containerID="cri-o://638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff" gracePeriod=30 Oct 02 07:01:16 crc kubenswrapper[4786]: I1002 07:01:16.558291 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" containerName="glance-log" containerID="cri-o://dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f" gracePeriod=30 Oct 02 07:01:16 crc kubenswrapper[4786]: I1002 07:01:16.558364 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerName="glance-httpd" containerID="cri-o://1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c" gracePeriod=30 Oct 02 07:01:16 crc kubenswrapper[4786]: I1002 07:01:16.558429 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" containerName="glance-httpd" containerID="cri-o://464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355" gracePeriod=30 Oct 02 07:01:16 crc kubenswrapper[4786]: I1002 07:01:16.928633 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.108433 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.121971 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-combined-ca-bundle\") pod \"5cc1379e-f398-4974-a6c1-408d344ff49c\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.122024 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-scripts\") pod \"5cc1379e-f398-4974-a6c1-408d344ff49c\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.122060 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8ncn\" (UniqueName: \"kubernetes.io/projected/5cc1379e-f398-4974-a6c1-408d344ff49c-kube-api-access-h8ncn\") pod \"5cc1379e-f398-4974-a6c1-408d344ff49c\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.122149 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-config-data\") pod \"5cc1379e-f398-4974-a6c1-408d344ff49c\" (UID: 
\"5cc1379e-f398-4974-a6c1-408d344ff49c\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.122166 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc1379e-f398-4974-a6c1-408d344ff49c-logs\") pod \"5cc1379e-f398-4974-a6c1-408d344ff49c\" (UID: \"5cc1379e-f398-4974-a6c1-408d344ff49c\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.123094 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cc1379e-f398-4974-a6c1-408d344ff49c-logs" (OuterVolumeSpecName: "logs") pod "5cc1379e-f398-4974-a6c1-408d344ff49c" (UID: "5cc1379e-f398-4974-a6c1-408d344ff49c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.131406 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-scripts" (OuterVolumeSpecName: "scripts") pod "5cc1379e-f398-4974-a6c1-408d344ff49c" (UID: "5cc1379e-f398-4974-a6c1-408d344ff49c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.143349 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc1379e-f398-4974-a6c1-408d344ff49c-kube-api-access-h8ncn" (OuterVolumeSpecName: "kube-api-access-h8ncn") pod "5cc1379e-f398-4974-a6c1-408d344ff49c" (UID: "5cc1379e-f398-4974-a6c1-408d344ff49c"). InnerVolumeSpecName "kube-api-access-h8ncn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.145838 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5cc1379e-f398-4974-a6c1-408d344ff49c" (UID: "5cc1379e-f398-4974-a6c1-408d344ff49c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.157923 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-config-data" (OuterVolumeSpecName: "config-data") pod "5cc1379e-f398-4974-a6c1-408d344ff49c" (UID: "5cc1379e-f398-4974-a6c1-408d344ff49c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.176453 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.223999 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-logs\") pod \"c8addfa7-2397-45de-82ed-5932b165ca3b\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.224066 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-scripts\") pod \"c8addfa7-2397-45de-82ed-5932b165ca3b\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.224091 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jqpz\" (UniqueName: \"kubernetes.io/projected/c8addfa7-2397-45de-82ed-5932b165ca3b-kube-api-access-9jqpz\") pod \"c8addfa7-2397-45de-82ed-5932b165ca3b\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.224116 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-combined-ca-bundle\") pod \"c8addfa7-2397-45de-82ed-5932b165ca3b\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.224188 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c8addfa7-2397-45de-82ed-5932b165ca3b\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.224232 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-config-data\") pod \"c8addfa7-2397-45de-82ed-5932b165ca3b\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.224252 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-httpd-run\") pod \"c8addfa7-2397-45de-82ed-5932b165ca3b\" (UID: \"c8addfa7-2397-45de-82ed-5932b165ca3b\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.224516 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-logs" (OuterVolumeSpecName: "logs") pod "c8addfa7-2397-45de-82ed-5932b165ca3b" (UID: "c8addfa7-2397-45de-82ed-5932b165ca3b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.224782 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c8addfa7-2397-45de-82ed-5932b165ca3b" (UID: "c8addfa7-2397-45de-82ed-5932b165ca3b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.225037 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.225056 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.225068 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.225076 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8ncn\" (UniqueName: \"kubernetes.io/projected/5cc1379e-f398-4974-a6c1-408d344ff49c-kube-api-access-h8ncn\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.225085 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5cc1379e-f398-4974-a6c1-408d344ff49c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.225093 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cc1379e-f398-4974-a6c1-408d344ff49c-logs\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.225100 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8addfa7-2397-45de-82ed-5932b165ca3b-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.226783 4786 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "c8addfa7-2397-45de-82ed-5932b165ca3b" (UID: "c8addfa7-2397-45de-82ed-5932b165ca3b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.227352 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-scripts" (OuterVolumeSpecName: "scripts") pod "c8addfa7-2397-45de-82ed-5932b165ca3b" (UID: "c8addfa7-2397-45de-82ed-5932b165ca3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.227802 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8addfa7-2397-45de-82ed-5932b165ca3b-kube-api-access-9jqpz" (OuterVolumeSpecName: "kube-api-access-9jqpz") pod "c8addfa7-2397-45de-82ed-5932b165ca3b" (UID: "c8addfa7-2397-45de-82ed-5932b165ca3b"). InnerVolumeSpecName "kube-api-access-9jqpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.246327 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8addfa7-2397-45de-82ed-5932b165ca3b" (UID: "c8addfa7-2397-45de-82ed-5932b165ca3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.258119 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-config-data" (OuterVolumeSpecName: "config-data") pod "c8addfa7-2397-45de-82ed-5932b165ca3b" (UID: "c8addfa7-2397-45de-82ed-5932b165ca3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327225 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-httpd-run\") pod \"bb4539ca-be53-44b5-acce-23af3a79326a\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327312 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-logs\") pod \"bb4539ca-be53-44b5-acce-23af3a79326a\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327342 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz7kj\" (UniqueName: \"kubernetes.io/projected/bb4539ca-be53-44b5-acce-23af3a79326a-kube-api-access-bz7kj\") pod \"bb4539ca-be53-44b5-acce-23af3a79326a\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327369 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-config-data\") pod \"bb4539ca-be53-44b5-acce-23af3a79326a\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327392 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-combined-ca-bundle\") pod \"bb4539ca-be53-44b5-acce-23af3a79326a\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327410 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-scripts\") pod \"bb4539ca-be53-44b5-acce-23af3a79326a\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327446 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"bb4539ca-be53-44b5-acce-23af3a79326a\" (UID: \"bb4539ca-be53-44b5-acce-23af3a79326a\") " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327704 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-logs" (OuterVolumeSpecName: "logs") pod "bb4539ca-be53-44b5-acce-23af3a79326a" (UID: "bb4539ca-be53-44b5-acce-23af3a79326a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327766 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327778 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327787 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jqpz\" (UniqueName: \"kubernetes.io/projected/c8addfa7-2397-45de-82ed-5932b165ca3b-kube-api-access-9jqpz\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327797 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8addfa7-2397-45de-82ed-5932b165ca3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc 
kubenswrapper[4786]: I1002 07:01:17.327815 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.327840 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bb4539ca-be53-44b5-acce-23af3a79326a" (UID: "bb4539ca-be53-44b5-acce-23af3a79326a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.329719 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4539ca-be53-44b5-acce-23af3a79326a-kube-api-access-bz7kj" (OuterVolumeSpecName: "kube-api-access-bz7kj") pod "bb4539ca-be53-44b5-acce-23af3a79326a" (UID: "bb4539ca-be53-44b5-acce-23af3a79326a"). InnerVolumeSpecName "kube-api-access-bz7kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.330585 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-scripts" (OuterVolumeSpecName: "scripts") pod "bb4539ca-be53-44b5-acce-23af3a79326a" (UID: "bb4539ca-be53-44b5-acce-23af3a79326a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.332119 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "bb4539ca-be53-44b5-acce-23af3a79326a" (UID: "bb4539ca-be53-44b5-acce-23af3a79326a"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.342337 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.346252 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb4539ca-be53-44b5-acce-23af3a79326a" (UID: "bb4539ca-be53-44b5-acce-23af3a79326a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.360516 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-config-data" (OuterVolumeSpecName: "config-data") pod "bb4539ca-be53-44b5-acce-23af3a79326a" (UID: "bb4539ca-be53-44b5-acce-23af3a79326a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.428932 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.428957 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.428966 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb4539ca-be53-44b5-acce-23af3a79326a-logs\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.428975 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz7kj\" (UniqueName: \"kubernetes.io/projected/bb4539ca-be53-44b5-acce-23af3a79326a-kube-api-access-bz7kj\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.428986 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.428993 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.429000 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4539ca-be53-44b5-acce-23af3a79326a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.429028 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.443603 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.530904 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.582860 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zt9q9" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.583838 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zt9q9" event={"ID":"5cc1379e-f398-4974-a6c1-408d344ff49c","Type":"ContainerDied","Data":"698b1b99601477959d4e0871399c9f34e364017b49a51fee2958545a571cce25"} Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.583865 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="698b1b99601477959d4e0871399c9f34e364017b49a51fee2958545a571cce25" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.594202 4786 generic.go:334] "Generic (PLEG): container finished" podID="bb4539ca-be53-44b5-acce-23af3a79326a" containerID="464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355" exitCode=0 Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.594225 4786 generic.go:334] "Generic (PLEG): container finished" podID="bb4539ca-be53-44b5-acce-23af3a79326a" containerID="dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f" exitCode=143 Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.594279 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"bb4539ca-be53-44b5-acce-23af3a79326a","Type":"ContainerDied","Data":"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355"} Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.594298 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4539ca-be53-44b5-acce-23af3a79326a","Type":"ContainerDied","Data":"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f"} Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.594307 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bb4539ca-be53-44b5-acce-23af3a79326a","Type":"ContainerDied","Data":"a89a7474884ec3a85339a4160f8c8d7225967d5a602e00e5d21bceed96b77dc1"} Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.594320 4786 scope.go:117] "RemoveContainer" containerID="464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.594448 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.607641 4786 generic.go:334] "Generic (PLEG): container finished" podID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerID="1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c" exitCode=0 Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.607669 4786 generic.go:334] "Generic (PLEG): container finished" podID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerID="638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff" exitCode=143 Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.607880 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.609601 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8addfa7-2397-45de-82ed-5932b165ca3b","Type":"ContainerDied","Data":"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c"} Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.609644 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8addfa7-2397-45de-82ed-5932b165ca3b","Type":"ContainerDied","Data":"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff"} Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.609655 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c8addfa7-2397-45de-82ed-5932b165ca3b","Type":"ContainerDied","Data":"a705b85dce79b60540ac965384f3c6646f500ac06e64828b5ff8d253419180bd"} Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.614926 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8c555dcd8-dbpk5"] Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.616313 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerName="glance-log" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616337 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerName="glance-log" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.616346 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8adabb-6397-4678-b0de-87c2b7817f69" containerName="dnsmasq-dns" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616351 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8adabb-6397-4678-b0de-87c2b7817f69" containerName="dnsmasq-dns" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 
07:01:17.616367 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8adabb-6397-4678-b0de-87c2b7817f69" containerName="init" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616373 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8adabb-6397-4678-b0de-87c2b7817f69" containerName="init" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.616381 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cc1379e-f398-4974-a6c1-408d344ff49c" containerName="placement-db-sync" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616393 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cc1379e-f398-4974-a6c1-408d344ff49c" containerName="placement-db-sync" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.616409 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerName="glance-httpd" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616415 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerName="glance-httpd" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.616443 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" containerName="glance-httpd" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616448 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" containerName="glance-httpd" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.616458 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" containerName="glance-log" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616463 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" containerName="glance-log" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616623 4786 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" containerName="glance-httpd" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616637 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerName="glance-httpd" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616648 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8adabb-6397-4678-b0de-87c2b7817f69" containerName="dnsmasq-dns" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616660 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" containerName="glance-log" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616669 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cc1379e-f398-4974-a6c1-408d344ff49c" containerName="placement-db-sync" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.616681 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" containerName="glance-log" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.617471 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.620681 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.620874 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.625853 4786 scope.go:117] "RemoveContainer" containerID="dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.635111 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8c555dcd8-dbpk5"] Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.642482 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mqwps" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.643324 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.644801 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.674404 4786 scope.go:117] "RemoveContainer" containerID="464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.675413 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355\": container with ID starting with 464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355 not found: ID does not exist" containerID="464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.675600 4786 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355"} err="failed to get container status \"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355\": rpc error: code = NotFound desc = could not find container \"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355\": container with ID starting with 464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355 not found: ID does not exist" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.675626 4786 scope.go:117] "RemoveContainer" containerID="dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.675940 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f\": container with ID starting with dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f not found: ID does not exist" containerID="dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.675965 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f"} err="failed to get container status \"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f\": rpc error: code = NotFound desc = could not find container \"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f\": container with ID starting with dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f not found: ID does not exist" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.675980 4786 scope.go:117] "RemoveContainer" containerID="464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.677801 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355"} err="failed to get container status \"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355\": rpc error: code = NotFound desc = could not find container \"464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355\": container with ID starting with 464f85c88afd658c379f80e7df4bf33038bc919b39ac8bb2175d6b8ac2dae355 not found: ID does not exist" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.677860 4786 scope.go:117] "RemoveContainer" containerID="dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.679807 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f"} err="failed to get container status \"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f\": rpc error: code = NotFound desc = could not find container \"dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f\": container with ID starting with dabfc4c5d3f544473787204ae8dcdc701226e8bcd47bc879d0ec079193f49b7f not found: ID does not exist" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.679832 4786 scope.go:117] "RemoveContainer" containerID="1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.681667 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.705463 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.713359 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:17 crc 
kubenswrapper[4786]: I1002 07:01:17.722068 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.723304 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.725119 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.727403 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-8rkqb" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.727632 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.727859 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.727970 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.731846 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.733185 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.736255 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.736461 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.737231 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-combined-ca-bundle\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.737252 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.737285 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-internal-tls-certs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.737512 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-scripts\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.737546 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-logs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.741329 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-config-data\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.741376 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-public-tls-certs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.741407 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdgg\" (UniqueName: \"kubernetes.io/projected/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-kube-api-access-thdgg\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.745080 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.777592 4786 scope.go:117] "RemoveContainer" containerID="638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.794952 4786 scope.go:117] "RemoveContainer" containerID="1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.795606 4786 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c\": container with ID starting with 1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c not found: ID does not exist" containerID="1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.795665 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c"} err="failed to get container status \"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c\": rpc error: code = NotFound desc = could not find container \"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c\": container with ID starting with 1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c not found: ID does not exist" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.795705 4786 scope.go:117] "RemoveContainer" containerID="638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff" Oct 02 07:01:17 crc kubenswrapper[4786]: E1002 07:01:17.796088 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff\": container with ID starting with 638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff not found: ID does not exist" containerID="638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.796137 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff"} err="failed to get container status \"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff\": rpc error: code = NotFound 
desc = could not find container \"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff\": container with ID starting with 638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff not found: ID does not exist" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.796160 4786 scope.go:117] "RemoveContainer" containerID="1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.796342 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c"} err="failed to get container status \"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c\": rpc error: code = NotFound desc = could not find container \"1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c\": container with ID starting with 1f46485d4425674a740ad217df757381c6e327c0cb503d7e9cc2c12fbf8b983c not found: ID does not exist" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.796361 4786 scope.go:117] "RemoveContainer" containerID="638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.796682 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff"} err="failed to get container status \"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff\": rpc error: code = NotFound desc = could not find container \"638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff\": container with ID starting with 638dfae83d23bca16b4ddecdb94531213c5e8a8c336828a77241d8a154686dff not found: ID does not exist" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842377 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdgg\" (UniqueName: 
\"kubernetes.io/projected/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-kube-api-access-thdgg\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842438 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842499 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842532 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-combined-ca-bundle\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842560 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-scripts\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842594 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842607 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842624 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842662 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-internal-tls-certs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842677 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-scripts\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842713 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842735 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt46b\" (UniqueName: \"kubernetes.io/projected/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-kube-api-access-nt46b\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842750 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-logs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842763 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwcv\" (UniqueName: \"kubernetes.io/projected/03227959-451d-483a-8d46-182fa634d20d-kube-api-access-kwwcv\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842783 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-config-data\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842808 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842823 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842836 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842861 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-public-tls-certs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842875 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842890 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-config-data\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842910 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.842929 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-logs\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.843800 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-logs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.850892 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-combined-ca-bundle\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.851120 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-config-data\") pod \"placement-8c555dcd8-dbpk5\" (UID: 
\"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.851194 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-internal-tls-certs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.853403 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-public-tls-certs\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.860344 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdgg\" (UniqueName: \"kubernetes.io/projected/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-kube-api-access-thdgg\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.860580 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07a06aef-d1a6-4093-ac4e-d6aa3ded6b60-scripts\") pod \"placement-8c555dcd8-dbpk5\" (UID: \"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60\") " pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.940722 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944107 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944153 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt46b\" (UniqueName: \"kubernetes.io/projected/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-kube-api-access-nt46b\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944171 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwcv\" (UniqueName: \"kubernetes.io/projected/03227959-451d-483a-8d46-182fa634d20d-kube-api-access-kwwcv\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944207 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944224 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " 
pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944239 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944265 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944282 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-config-data\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944298 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944318 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-logs\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944348 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944395 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944447 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-scripts\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944493 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944510 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.944529 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.945459 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.945749 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.945995 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.946040 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.948123 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-logs\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.948337 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.948765 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-scripts\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.956855 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.957612 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.958325 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.960270 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.960385 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.962933 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt46b\" (UniqueName: \"kubernetes.io/projected/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-kube-api-access-nt46b\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.962971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.963193 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwcv\" (UniqueName: \"kubernetes.io/projected/03227959-451d-483a-8d46-182fa634d20d-kube-api-access-kwwcv\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " 
pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.966244 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-config-data\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.984048 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:01:17 crc kubenswrapper[4786]: I1002 07:01:17.986085 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " pod="openstack/glance-default-external-api-0" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.004176 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.056824 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.071847 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.147510 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-config-data\") pod \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.147641 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxf6z\" (UniqueName: \"kubernetes.io/projected/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-kube-api-access-kxf6z\") pod \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.147678 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-combined-ca-bundle\") pod \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.147834 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-credential-keys\") pod \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.147899 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-fernet-keys\") pod \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.147923 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-scripts\") pod \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\" (UID: \"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46\") " Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.151230 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-kube-api-access-kxf6z" (OuterVolumeSpecName: "kube-api-access-kxf6z") pod "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" (UID: "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46"). InnerVolumeSpecName "kube-api-access-kxf6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.151522 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-scripts" (OuterVolumeSpecName: "scripts") pod "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" (UID: "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.152499 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" (UID: "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.153821 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" (UID: "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.167240 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-config-data" (OuterVolumeSpecName: "config-data") pod "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" (UID: "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.171366 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" (UID: "fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.191202 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4539ca-be53-44b5-acce-23af3a79326a" path="/var/lib/kubelet/pods/bb4539ca-be53-44b5-acce-23af3a79326a/volumes" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.191916 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8addfa7-2397-45de-82ed-5932b165ca3b" path="/var/lib/kubelet/pods/c8addfa7-2397-45de-82ed-5932b165ca3b/volumes" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.249452 4786 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.249478 4786 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:18 crc kubenswrapper[4786]: 
I1002 07:01:18.249488 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.249497 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.249505 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxf6z\" (UniqueName: \"kubernetes.io/projected/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-kube-api-access-kxf6z\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.249513 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.616320 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lbkdj" event={"ID":"fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46","Type":"ContainerDied","Data":"8949edb9879b527e6e9d99488e52eb00ff63895692feb5803e2b24296d66e9c6"} Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.616584 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8949edb9879b527e6e9d99488e52eb00ff63895692feb5803e2b24296d66e9c6" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.616334 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lbkdj" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.711863 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-749d95758-7hp9f"] Oct 02 07:01:18 crc kubenswrapper[4786]: E1002 07:01:18.712458 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" containerName="keystone-bootstrap" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.712482 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" containerName="keystone-bootstrap" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.712653 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" containerName="keystone-bootstrap" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.716453 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.718188 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.719064 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-749d95758-7hp9f"] Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.719138 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.719189 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.719193 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.719253 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v5blg" 
Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.719340 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.861717 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-scripts\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.861756 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-combined-ca-bundle\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.861784 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-fernet-keys\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.861952 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-config-data\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.861994 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-internal-tls-certs\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.862075 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-credential-keys\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.862164 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-public-tls-certs\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.862304 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zdtb\" (UniqueName: \"kubernetes.io/projected/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-kube-api-access-6zdtb\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.964642 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-scripts\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.964781 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-combined-ca-bundle\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.964993 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-fernet-keys\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.965151 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-config-data\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.965220 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-internal-tls-certs\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.965277 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-credential-keys\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.965532 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-public-tls-certs\") pod 
\"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.965735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zdtb\" (UniqueName: \"kubernetes.io/projected/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-kube-api-access-6zdtb\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.968861 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-credential-keys\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.968877 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-internal-tls-certs\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.969097 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-scripts\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.969966 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-fernet-keys\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 
07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.970035 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-combined-ca-bundle\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.970921 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-public-tls-certs\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.976063 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-config-data\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:18 crc kubenswrapper[4786]: I1002 07:01:18.978133 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zdtb\" (UniqueName: \"kubernetes.io/projected/1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7-kube-api-access-6zdtb\") pod \"keystone-749d95758-7hp9f\" (UID: \"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7\") " pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:19 crc kubenswrapper[4786]: I1002 07:01:19.032174 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:19 crc kubenswrapper[4786]: I1002 07:01:19.627858 4786 generic.go:334] "Generic (PLEG): container finished" podID="ca674644-5646-452d-ba2a-5ff2844f64ea" containerID="2180bad403aa120ca73eb4c7fc08aaad33410b1a1b15857057521c74dbd589cc" exitCode=0 Oct 02 07:01:20 crc kubenswrapper[4786]: I1002 07:01:19.627938 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f42fq" event={"ID":"ca674644-5646-452d-ba2a-5ff2844f64ea","Type":"ContainerDied","Data":"2180bad403aa120ca73eb4c7fc08aaad33410b1a1b15857057521c74dbd589cc"} Oct 02 07:01:21 crc kubenswrapper[4786]: I1002 07:01:21.251416 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8c555dcd8-dbpk5"] Oct 02 07:01:22 crc kubenswrapper[4786]: I1002 07:01:22.032819 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:22 crc kubenswrapper[4786]: I1002 07:01:22.077407 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85f4f889b9-jz4kr"] Oct 02 07:01:22 crc kubenswrapper[4786]: I1002 07:01:22.077608 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" podUID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerName="dnsmasq-dns" containerID="cri-o://deac9b38b0b84db27e9114e0a72b9105f44b124b581466156bd7702ba72d917a" gracePeriod=10 Oct 02 07:01:22 crc kubenswrapper[4786]: I1002 07:01:22.647083 4786 generic.go:334] "Generic (PLEG): container finished" podID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerID="deac9b38b0b84db27e9114e0a72b9105f44b124b581466156bd7702ba72d917a" exitCode=0 Oct 02 07:01:22 crc kubenswrapper[4786]: I1002 07:01:22.647148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" 
event={"ID":"0b02fae5-2435-4676-983f-f60bcd4d58a0","Type":"ContainerDied","Data":"deac9b38b0b84db27e9114e0a72b9105f44b124b581466156bd7702ba72d917a"} Oct 02 07:01:23 crc kubenswrapper[4786]: I1002 07:01:23.596509 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" podUID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.118:5353: connect: connection refused" Oct 02 07:01:23 crc kubenswrapper[4786]: I1002 07:01:23.882987 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:23 crc kubenswrapper[4786]: I1002 07:01:23.930366 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.057341 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-svc\") pod \"0b02fae5-2435-4676-983f-f60bcd4d58a0\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.057756 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-config\") pod \"0b02fae5-2435-4676-983f-f60bcd4d58a0\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.057807 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-sb\") pod \"0b02fae5-2435-4676-983f-f60bcd4d58a0\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.057861 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-nb\") pod \"0b02fae5-2435-4676-983f-f60bcd4d58a0\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.057919 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wgzs\" (UniqueName: \"kubernetes.io/projected/0b02fae5-2435-4676-983f-f60bcd4d58a0-kube-api-access-7wgzs\") pod \"0b02fae5-2435-4676-983f-f60bcd4d58a0\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.057951 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-swift-storage-0\") pod \"0b02fae5-2435-4676-983f-f60bcd4d58a0\" (UID: \"0b02fae5-2435-4676-983f-f60bcd4d58a0\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.057997 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-config\") pod \"ca674644-5646-452d-ba2a-5ff2844f64ea\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.058037 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-combined-ca-bundle\") pod \"ca674644-5646-452d-ba2a-5ff2844f64ea\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.058064 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n7xf\" (UniqueName: \"kubernetes.io/projected/ca674644-5646-452d-ba2a-5ff2844f64ea-kube-api-access-5n7xf\") pod 
\"ca674644-5646-452d-ba2a-5ff2844f64ea\" (UID: \"ca674644-5646-452d-ba2a-5ff2844f64ea\") " Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.061530 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b02fae5-2435-4676-983f-f60bcd4d58a0-kube-api-access-7wgzs" (OuterVolumeSpecName: "kube-api-access-7wgzs") pod "0b02fae5-2435-4676-983f-f60bcd4d58a0" (UID: "0b02fae5-2435-4676-983f-f60bcd4d58a0"). InnerVolumeSpecName "kube-api-access-7wgzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.063815 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca674644-5646-452d-ba2a-5ff2844f64ea-kube-api-access-5n7xf" (OuterVolumeSpecName: "kube-api-access-5n7xf") pod "ca674644-5646-452d-ba2a-5ff2844f64ea" (UID: "ca674644-5646-452d-ba2a-5ff2844f64ea"). InnerVolumeSpecName "kube-api-access-5n7xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.079751 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-config" (OuterVolumeSpecName: "config") pod "ca674644-5646-452d-ba2a-5ff2844f64ea" (UID: "ca674644-5646-452d-ba2a-5ff2844f64ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.093526 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0b02fae5-2435-4676-983f-f60bcd4d58a0" (UID: "0b02fae5-2435-4676-983f-f60bcd4d58a0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.093623 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca674644-5646-452d-ba2a-5ff2844f64ea" (UID: "ca674644-5646-452d-ba2a-5ff2844f64ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.097301 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0b02fae5-2435-4676-983f-f60bcd4d58a0" (UID: "0b02fae5-2435-4676-983f-f60bcd4d58a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.097351 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0b02fae5-2435-4676-983f-f60bcd4d58a0" (UID: "0b02fae5-2435-4676-983f-f60bcd4d58a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.098004 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0b02fae5-2435-4676-983f-f60bcd4d58a0" (UID: "0b02fae5-2435-4676-983f-f60bcd4d58a0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.110254 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-config" (OuterVolumeSpecName: "config") pod "0b02fae5-2435-4676-983f-f60bcd4d58a0" (UID: "0b02fae5-2435-4676-983f-f60bcd4d58a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.128106 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-749d95758-7hp9f"] Oct 02 07:01:24 crc kubenswrapper[4786]: W1002 07:01:24.129192 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c71cbf2_87c2_4bd2_a6c9_bd641976f8d7.slice/crio-8a090902c6665ca0510f9c2474991143d21e43ac61bbc07fc8f56118e9da895f WatchSource:0}: Error finding container 8a090902c6665ca0510f9c2474991143d21e43ac61bbc07fc8f56118e9da895f: Status 404 returned error can't find the container with id 8a090902c6665ca0510f9c2474991143d21e43ac61bbc07fc8f56118e9da895f Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.160072 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.160100 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.160110 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 
07:01:24.160120 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.160128 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wgzs\" (UniqueName: \"kubernetes.io/projected/0b02fae5-2435-4676-983f-f60bcd4d58a0-kube-api-access-7wgzs\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.160137 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0b02fae5-2435-4676-983f-f60bcd4d58a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.160144 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.160151 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca674644-5646-452d-ba2a-5ff2844f64ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.160159 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n7xf\" (UniqueName: \"kubernetes.io/projected/ca674644-5646-452d-ba2a-5ff2844f64ea-kube-api-access-5n7xf\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.188013 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:01:24 crc kubenswrapper[4786]: W1002 07:01:24.194211 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03227959_451d_483a_8d46_182fa634d20d.slice/crio-03c4336f0ba0a13b4678678f78b7d2ef72ba0caf7b06a977e32559d4d741026e WatchSource:0}: Error finding container 03c4336f0ba0a13b4678678f78b7d2ef72ba0caf7b06a977e32559d4d741026e: Status 404 returned error can't find the container with id 03c4336f0ba0a13b4678678f78b7d2ef72ba0caf7b06a977e32559d4d741026e Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.316286 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.674111 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9","Type":"ContainerStarted","Data":"51b0574098bf921d91f6e77111fa094dc422f2f831edd09a79a93ce1eb8e4898"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.675939 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-749d95758-7hp9f" event={"ID":"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7","Type":"ContainerStarted","Data":"97416d7148a089e079528ba31f7debc73b30b68e5cc11b2bb1a652e09b392383"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.675964 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-749d95758-7hp9f" event={"ID":"1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7","Type":"ContainerStarted","Data":"8a090902c6665ca0510f9c2474991143d21e43ac61bbc07fc8f56118e9da895f"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.676057 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.678332 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" 
event={"ID":"0b02fae5-2435-4676-983f-f60bcd4d58a0","Type":"ContainerDied","Data":"250afb4ced759ceb8348c87ff83bfabec6918a9abc8f8320bcae4e24fe034489"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.678350 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f4f889b9-jz4kr" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.678362 4786 scope.go:117] "RemoveContainer" containerID="deac9b38b0b84db27e9114e0a72b9105f44b124b581466156bd7702ba72d917a" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.680141 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-f42fq" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.680167 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-f42fq" event={"ID":"ca674644-5646-452d-ba2a-5ff2844f64ea","Type":"ContainerDied","Data":"0904eb1b933d3d8996f538e62e7134a30cfcaf2f76d4010a3246e6fb48258819"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.680188 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0904eb1b933d3d8996f538e62e7134a30cfcaf2f76d4010a3246e6fb48258819" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.682613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8c555dcd8-dbpk5" event={"ID":"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60","Type":"ContainerStarted","Data":"4f01d93bc39c75ca20268868dbfcf66f23dc4ddd542500bf2dad7ad18d603f2f"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.682636 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8c555dcd8-dbpk5" event={"ID":"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60","Type":"ContainerStarted","Data":"d9b80a243367ad7bceb31f811a9a3f1bddaadb6b5895ad554c6d94cf8f63d1c5"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.682647 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-8c555dcd8-dbpk5" event={"ID":"07a06aef-d1a6-4093-ac4e-d6aa3ded6b60","Type":"ContainerStarted","Data":"4c532bad6c09b92cffca3e3c3bf6482ebd9bc459e3daa364f71226c9ccadb510"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.682674 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.682758 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.684368 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46da85da-85a6-4ff7-8410-c26ccd99967e","Type":"ContainerStarted","Data":"d725fe44277f0a83e913ed067f7f05ca9e57cadd8fdd80931e59e56ef7973a73"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.685795 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xt8ms" event={"ID":"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6","Type":"ContainerStarted","Data":"f239a343a449fdf0fa68f6585f065e334559dde91bd48427401248d6a8585064"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.687084 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"03227959-451d-483a-8d46-182fa634d20d","Type":"ContainerStarted","Data":"03c4336f0ba0a13b4678678f78b7d2ef72ba0caf7b06a977e32559d4d741026e"} Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.691614 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-749d95758-7hp9f" podStartSLOduration=6.6916052950000005 podStartE2EDuration="6.691605295s" podCreationTimestamp="2025-10-02 07:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:24.6903773 +0000 UTC m=+894.811560441" watchObservedRunningTime="2025-10-02 
07:01:24.691605295 +0000 UTC m=+894.812788426" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.708478 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xt8ms" podStartSLOduration=2.7186573960000002 podStartE2EDuration="12.70846625s" podCreationTimestamp="2025-10-02 07:01:12 +0000 UTC" firstStartedPulling="2025-10-02 07:01:13.749390949 +0000 UTC m=+883.870574081" lastFinishedPulling="2025-10-02 07:01:23.739199804 +0000 UTC m=+893.860382935" observedRunningTime="2025-10-02 07:01:24.706613134 +0000 UTC m=+894.827796265" watchObservedRunningTime="2025-10-02 07:01:24.70846625 +0000 UTC m=+894.829649381" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.724202 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8c555dcd8-dbpk5" podStartSLOduration=7.724189047 podStartE2EDuration="7.724189047s" podCreationTimestamp="2025-10-02 07:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:24.720523032 +0000 UTC m=+894.841706173" watchObservedRunningTime="2025-10-02 07:01:24.724189047 +0000 UTC m=+894.845372179" Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.762232 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85f4f889b9-jz4kr"] Oct 02 07:01:24 crc kubenswrapper[4786]: I1002 07:01:24.767585 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85f4f889b9-jz4kr"] Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.016276 4786 scope.go:117] "RemoveContainer" containerID="6d70055c5489d28004e433bdf8256a857257c5f80b1f0b521ba0a02f1ed82edd" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.047680 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb"] Oct 02 07:01:25 crc kubenswrapper[4786]: E1002 07:01:25.048106 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerName="init" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.048122 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerName="init" Oct 02 07:01:25 crc kubenswrapper[4786]: E1002 07:01:25.048131 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerName="dnsmasq-dns" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.048136 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerName="dnsmasq-dns" Oct 02 07:01:25 crc kubenswrapper[4786]: E1002 07:01:25.048153 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca674644-5646-452d-ba2a-5ff2844f64ea" containerName="neutron-db-sync" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.048159 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca674644-5646-452d-ba2a-5ff2844f64ea" containerName="neutron-db-sync" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.048355 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b02fae5-2435-4676-983f-f60bcd4d58a0" containerName="dnsmasq-dns" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.048379 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca674644-5646-452d-ba2a-5ff2844f64ea" containerName="neutron-db-sync" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.049370 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.074903 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb"] Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.178931 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.179549 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-config\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.179591 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-svc\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.179611 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.179654 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-7k6hq\" (UniqueName: \"kubernetes.io/projected/d62f273f-e2ef-4c5d-a319-1f38ee22d823-kube-api-access-7k6hq\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.179778 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.194103 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-87d5477fb-vgjcn"] Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.195438 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.197046 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.198663 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.198976 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.199014 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-84r2m" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.212021 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-87d5477fb-vgjcn"] Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282546 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282599 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-httpd-config\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282669 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgc5g\" (UniqueName: \"kubernetes.io/projected/f320f12a-3d58-4c69-8191-29399923abe2-kube-api-access-hgc5g\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282711 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-config\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282741 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282871 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-ovndb-tls-certs\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282920 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-config\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282949 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-svc\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.282974 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.283045 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k6hq\" (UniqueName: \"kubernetes.io/projected/d62f273f-e2ef-4c5d-a319-1f38ee22d823-kube-api-access-7k6hq\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.283130 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-combined-ca-bundle\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.283371 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.285203 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.285881 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-config\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.285950 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-svc\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.286212 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-swift-storage-0\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: 
\"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.298860 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k6hq\" (UniqueName: \"kubernetes.io/projected/d62f273f-e2ef-4c5d-a319-1f38ee22d823-kube-api-access-7k6hq\") pod \"dnsmasq-dns-6cb8bc4c8f-w7dhb\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.384550 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-httpd-config\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.384632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgc5g\" (UniqueName: \"kubernetes.io/projected/f320f12a-3d58-4c69-8191-29399923abe2-kube-api-access-hgc5g\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.384665 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-config\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.384712 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-ovndb-tls-certs\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 
07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.384831 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-combined-ca-bundle\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.395518 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-httpd-config\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.395577 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-ovndb-tls-certs\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.395791 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-config\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.403803 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgc5g\" (UniqueName: \"kubernetes.io/projected/f320f12a-3d58-4c69-8191-29399923abe2-kube-api-access-hgc5g\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.404818 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.406233 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-combined-ca-bundle\") pod \"neutron-87d5477fb-vgjcn\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.511843 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.705750 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9","Type":"ContainerStarted","Data":"240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e"} Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.713260 4786 generic.go:334] "Generic (PLEG): container finished" podID="9f7b97c5-b382-4fdf-bb25-b384b16eb1f6" containerID="f239a343a449fdf0fa68f6585f065e334559dde91bd48427401248d6a8585064" exitCode=0 Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.713371 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xt8ms" event={"ID":"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6","Type":"ContainerDied","Data":"f239a343a449fdf0fa68f6585f065e334559dde91bd48427401248d6a8585064"} Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.731739 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"03227959-451d-483a-8d46-182fa634d20d","Type":"ContainerStarted","Data":"30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513"} Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.731779 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"03227959-451d-483a-8d46-182fa634d20d","Type":"ContainerStarted","Data":"da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692"} Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.758355 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.758330644 podStartE2EDuration="8.758330644s" podCreationTimestamp="2025-10-02 07:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:25.75108647 +0000 UTC m=+895.872269611" watchObservedRunningTime="2025-10-02 07:01:25.758330644 +0000 UTC m=+895.879513775" Oct 02 07:01:25 crc kubenswrapper[4786]: I1002 07:01:25.855921 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb"] Oct 02 07:01:25 crc kubenswrapper[4786]: W1002 07:01:25.862271 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd62f273f_e2ef_4c5d_a319_1f38ee22d823.slice/crio-754e11bb94723bd0be8be64f0fdff298e823cc3f7a687c5aab9e46e1570d63d1 WatchSource:0}: Error finding container 754e11bb94723bd0be8be64f0fdff298e823cc3f7a687c5aab9e46e1570d63d1: Status 404 returned error can't find the container with id 754e11bb94723bd0be8be64f0fdff298e823cc3f7a687c5aab9e46e1570d63d1 Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.082266 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-87d5477fb-vgjcn"] Oct 02 07:01:26 crc kubenswrapper[4786]: W1002 07:01:26.130184 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf320f12a_3d58_4c69_8191_29399923abe2.slice/crio-f6e65a19cd209ffa3a65af8dd0d0a665eea90a6c407b8afb8bc15c99ebc1c585 WatchSource:0}: Error finding container 
f6e65a19cd209ffa3a65af8dd0d0a665eea90a6c407b8afb8bc15c99ebc1c585: Status 404 returned error can't find the container with id f6e65a19cd209ffa3a65af8dd0d0a665eea90a6c407b8afb8bc15c99ebc1c585 Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.187270 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b02fae5-2435-4676-983f-f60bcd4d58a0" path="/var/lib/kubelet/pods/0b02fae5-2435-4676-983f-f60bcd4d58a0/volumes" Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.739918 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9","Type":"ContainerStarted","Data":"472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7"} Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.742773 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-87d5477fb-vgjcn" event={"ID":"f320f12a-3d58-4c69-8191-29399923abe2","Type":"ContainerStarted","Data":"3d07d32eb08fa90802051e7d312bcebf5b44e7daf7802d97feb236249b2c67ef"} Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.742819 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-87d5477fb-vgjcn" event={"ID":"f320f12a-3d58-4c69-8191-29399923abe2","Type":"ContainerStarted","Data":"4e38365225cee24aff4e7e67e68d3ea58798295979f740a7bf6116aa24a91b66"} Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.742832 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-87d5477fb-vgjcn" event={"ID":"f320f12a-3d58-4c69-8191-29399923abe2","Type":"ContainerStarted","Data":"f6e65a19cd209ffa3a65af8dd0d0a665eea90a6c407b8afb8bc15c99ebc1c585"} Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.742891 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.745090 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerID="7e7d2c65006a0ce09651b668934485c8654baeb54290a7734c7beee532f068e2" exitCode=0 Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.745365 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" event={"ID":"d62f273f-e2ef-4c5d-a319-1f38ee22d823","Type":"ContainerDied","Data":"7e7d2c65006a0ce09651b668934485c8654baeb54290a7734c7beee532f068e2"} Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.745402 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" event={"ID":"d62f273f-e2ef-4c5d-a319-1f38ee22d823","Type":"ContainerStarted","Data":"754e11bb94723bd0be8be64f0fdff298e823cc3f7a687c5aab9e46e1570d63d1"} Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.758169 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.758155582 podStartE2EDuration="9.758155582s" podCreationTimestamp="2025-10-02 07:01:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:26.756392085 +0000 UTC m=+896.877575216" watchObservedRunningTime="2025-10-02 07:01:26.758155582 +0000 UTC m=+896.879338713" Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.797365 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-87d5477fb-vgjcn" podStartSLOduration=1.797340986 podStartE2EDuration="1.797340986s" podCreationTimestamp="2025-10-02 07:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:26.790585364 +0000 UTC m=+896.911768505" watchObservedRunningTime="2025-10-02 07:01:26.797340986 +0000 UTC m=+896.918524117" Oct 02 07:01:26 crc kubenswrapper[4786]: I1002 07:01:26.987227 4786 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.125301 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rqpg\" (UniqueName: \"kubernetes.io/projected/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-kube-api-access-6rqpg\") pod \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.125563 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-db-sync-config-data\") pod \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.125923 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-combined-ca-bundle\") pod \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\" (UID: \"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6\") " Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.131302 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9f7b97c5-b382-4fdf-bb25-b384b16eb1f6" (UID: "9f7b97c5-b382-4fdf-bb25-b384b16eb1f6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.131763 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-kube-api-access-6rqpg" (OuterVolumeSpecName: "kube-api-access-6rqpg") pod "9f7b97c5-b382-4fdf-bb25-b384b16eb1f6" (UID: "9f7b97c5-b382-4fdf-bb25-b384b16eb1f6"). 
InnerVolumeSpecName "kube-api-access-6rqpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.149681 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f7b97c5-b382-4fdf-bb25-b384b16eb1f6" (UID: "9f7b97c5-b382-4fdf-bb25-b384b16eb1f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.232508 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.232537 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.232566 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rqpg\" (UniqueName: \"kubernetes.io/projected/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6-kube-api-access-6rqpg\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.274680 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c985b58cc-mrm2m"] Oct 02 07:01:27 crc kubenswrapper[4786]: E1002 07:01:27.275261 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7b97c5-b382-4fdf-bb25-b384b16eb1f6" containerName="barbican-db-sync" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.275281 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7b97c5-b382-4fdf-bb25-b384b16eb1f6" containerName="barbican-db-sync" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.275607 
4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f7b97c5-b382-4fdf-bb25-b384b16eb1f6" containerName="barbican-db-sync" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.276778 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.280547 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.280889 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.287742 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c985b58cc-mrm2m"] Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.335270 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-internal-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.335586 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2rn\" (UniqueName: \"kubernetes.io/projected/89685228-db75-441e-83ae-74720db4de72-kube-api-access-2z2rn\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.335618 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-combined-ca-bundle\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") 
" pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.335835 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-public-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.336087 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-config\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.336182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-httpd-config\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.336255 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-ovndb-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.438406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-internal-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " 
pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.438478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-combined-ca-bundle\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.438495 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2rn\" (UniqueName: \"kubernetes.io/projected/89685228-db75-441e-83ae-74720db4de72-kube-api-access-2z2rn\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.438518 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-public-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.438667 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-config\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.438722 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-httpd-config\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 
07:01:27.438762 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-ovndb-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.442109 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-ovndb-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.442792 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-httpd-config\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.443345 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-config\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.443457 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-combined-ca-bundle\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.443794 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-internal-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.444597 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89685228-db75-441e-83ae-74720db4de72-public-tls-certs\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.453758 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2rn\" (UniqueName: \"kubernetes.io/projected/89685228-db75-441e-83ae-74720db4de72-kube-api-access-2z2rn\") pod \"neutron-5c985b58cc-mrm2m\" (UID: \"89685228-db75-441e-83ae-74720db4de72\") " pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.497754 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.497803 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.597923 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.762990 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xt8ms" event={"ID":"9f7b97c5-b382-4fdf-bb25-b384b16eb1f6","Type":"ContainerDied","Data":"48806de6aa5dd587c633e238f0f7406b536afb19e9903c108522003e76df4a6d"} Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.763021 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48806de6aa5dd587c633e238f0f7406b536afb19e9903c108522003e76df4a6d" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.763084 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xt8ms" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.777809 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" event={"ID":"d62f273f-e2ef-4c5d-a319-1f38ee22d823","Type":"ContainerStarted","Data":"365b68d0b2dd6598bfa7ba3198fc52d68d666b8e55c3e46982c925e05253937d"} Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.778535 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.797126 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" podStartSLOduration=2.797112582 podStartE2EDuration="2.797112582s" podCreationTimestamp="2025-10-02 07:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:27.794377995 +0000 UTC m=+897.915561126" watchObservedRunningTime="2025-10-02 07:01:27.797112582 +0000 UTC m=+897.918295714" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.924428 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5fc459f649-l589x"] 
Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.925855 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.929989 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.938845 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-755cbf448b-g8s9l"] Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.940063 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.946785 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.947171 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.947318 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-s2wrb" Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.954133 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5fc459f649-l589x"] Oct 02 07:01:27 crc kubenswrapper[4786]: I1002 07:01:27.964853 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-755cbf448b-g8s9l"] Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.004619 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb"] Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.018741 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84cf884f69-szg6z"] Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.019919 
4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.046787 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84cf884f69-szg6z"] Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.051772 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-combined-ca-bundle\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.051818 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-config-data-custom\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.051841 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/812267df-c387-42a6-a6b4-758beccdd77d-logs\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.051866 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-config-data\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 
02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.051993 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54jg5\" (UniqueName: \"kubernetes.io/projected/af54426b-d853-4806-bdf8-c1fd22cb6752-kube-api-access-54jg5\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.052056 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af54426b-d853-4806-bdf8-c1fd22cb6752-logs\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.052101 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-config-data\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.052124 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktghv\" (UniqueName: \"kubernetes.io/projected/812267df-c387-42a6-a6b4-758beccdd77d-kube-api-access-ktghv\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.052142 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-config-data-custom\") pod 
\"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.052232 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-combined-ca-bundle\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.057211 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.057237 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.073889 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.073922 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.097440 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c598894db-vwfzw"] Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.098563 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.103040 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.109913 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.111636 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c598894db-vwfzw"] Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.121739 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.138618 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.142163 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153190 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-config-data\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153222 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktghv\" (UniqueName: \"kubernetes.io/projected/812267df-c387-42a6-a6b4-758beccdd77d-kube-api-access-ktghv\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc 
kubenswrapper[4786]: I1002 07:01:28.153246 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-config-data-custom\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153282 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpt8q\" (UniqueName: \"kubernetes.io/projected/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-kube-api-access-bpt8q\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153312 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153340 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-combined-ca-bundle\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153356 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-combined-ca-bundle\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " 
pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153371 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-swift-storage-0\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153428 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-sb\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153446 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-combined-ca-bundle\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153484 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data-custom\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153503 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-config-data-custom\") pod 
\"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153522 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/812267df-c387-42a6-a6b4-758beccdd77d-logs\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153557 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-svc\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153574 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-config-data\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153628 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54jg5\" (UniqueName: \"kubernetes.io/projected/af54426b-d853-4806-bdf8-c1fd22cb6752-kube-api-access-54jg5\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153644 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/36d65eb2-c554-4ca7-a21a-16375fcbd118-logs\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153667 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-nb\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153684 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af54426b-d853-4806-bdf8-c1fd22cb6752-logs\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153717 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-config\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.153731 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99q6g\" (UniqueName: \"kubernetes.io/projected/36d65eb2-c554-4ca7-a21a-16375fcbd118-kube-api-access-99q6g\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.155211 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/812267df-c387-42a6-a6b4-758beccdd77d-logs\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.159465 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-config-data-custom\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.160175 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-config-data\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.160243 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af54426b-d853-4806-bdf8-c1fd22cb6752-logs\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.165536 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-config-data-custom\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.166059 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-combined-ca-bundle\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.166683 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af54426b-d853-4806-bdf8-c1fd22cb6752-config-data\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.167730 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/812267df-c387-42a6-a6b4-758beccdd77d-combined-ca-bundle\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.175025 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54jg5\" (UniqueName: \"kubernetes.io/projected/af54426b-d853-4806-bdf8-c1fd22cb6752-kube-api-access-54jg5\") pod \"barbican-worker-5fc459f649-l589x\" (UID: \"af54426b-d853-4806-bdf8-c1fd22cb6752\") " pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.175275 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktghv\" (UniqueName: \"kubernetes.io/projected/812267df-c387-42a6-a6b4-758beccdd77d-kube-api-access-ktghv\") pod \"barbican-keystone-listener-755cbf448b-g8s9l\" (UID: \"812267df-c387-42a6-a6b4-758beccdd77d\") " pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.195107 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-5c985b58cc-mrm2m"] Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.255653 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-combined-ca-bundle\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256550 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-swift-storage-0\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256610 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-sb\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256639 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data-custom\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256675 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-svc\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" 
Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256740 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d65eb2-c554-4ca7-a21a-16375fcbd118-logs\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256771 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-nb\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256789 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-config\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256804 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99q6g\" (UniqueName: \"kubernetes.io/projected/36d65eb2-c554-4ca7-a21a-16375fcbd118-kube-api-access-99q6g\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.256840 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpt8q\" (UniqueName: \"kubernetes.io/projected/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-kube-api-access-bpt8q\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 
07:01:28.256866 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.257941 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-svc\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.258111 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-sb\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.258158 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-swift-storage-0\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.258466 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d65eb2-c554-4ca7-a21a-16375fcbd118-logs\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.258932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-nb\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.259824 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-config\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.261240 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5fc459f649-l589x" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.265278 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-combined-ca-bundle\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.265898 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.273714 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99q6g\" (UniqueName: \"kubernetes.io/projected/36d65eb2-c554-4ca7-a21a-16375fcbd118-kube-api-access-99q6g\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 
07:01:28.274799 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpt8q\" (UniqueName: \"kubernetes.io/projected/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-kube-api-access-bpt8q\") pod \"dnsmasq-dns-84cf884f69-szg6z\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.275879 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data-custom\") pod \"barbican-api-5c598894db-vwfzw\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.277606 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.349680 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.423594 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.782199 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.782232 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.782243 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 07:01:28 crc kubenswrapper[4786]: I1002 07:01:28.782252 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 07:01:29 crc kubenswrapper[4786]: I1002 07:01:29.790614 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" podUID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerName="dnsmasq-dns" containerID="cri-o://365b68d0b2dd6598bfa7ba3198fc52d68d666b8e55c3e46982c925e05253937d" gracePeriod=10 Oct 02 07:01:30 crc kubenswrapper[4786]: I1002 07:01:30.802065 4786 generic.go:334] "Generic (PLEG): container finished" podID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerID="365b68d0b2dd6598bfa7ba3198fc52d68d666b8e55c3e46982c925e05253937d" exitCode=0 Oct 02 07:01:30 crc kubenswrapper[4786]: I1002 07:01:30.802158 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" event={"ID":"d62f273f-e2ef-4c5d-a319-1f38ee22d823","Type":"ContainerDied","Data":"365b68d0b2dd6598bfa7ba3198fc52d68d666b8e55c3e46982c925e05253937d"} Oct 02 07:01:30 crc kubenswrapper[4786]: W1002 07:01:30.874114 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89685228_db75_441e_83ae_74720db4de72.slice/crio-a08051da76e4f0f2c9a3e2d0767867e5e2e59fdb0d2641582195385e619921f8 WatchSource:0}: Error finding container a08051da76e4f0f2c9a3e2d0767867e5e2e59fdb0d2641582195385e619921f8: Status 404 returned error can't find the container with id a08051da76e4f0f2c9a3e2d0767867e5e2e59fdb0d2641582195385e619921f8 Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.734474 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6c4f87b654-hv865"] Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.736119 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.740066 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.740239 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.748716 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c4f87b654-hv865"] Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.816707 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c985b58cc-mrm2m" event={"ID":"89685228-db75-441e-83ae-74720db4de72","Type":"ContainerStarted","Data":"a08051da76e4f0f2c9a3e2d0767867e5e2e59fdb0d2641582195385e619921f8"} Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.841850 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-internal-tls-certs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 
07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.841885 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-config-data\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.841920 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-combined-ca-bundle\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.841940 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed22a27c-dae6-448c-b789-85add05aff31-logs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.842089 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-public-tls-certs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.842256 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-config-data-custom\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " 
pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.842371 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69lsr\" (UniqueName: \"kubernetes.io/projected/ed22a27c-dae6-448c-b789-85add05aff31-kube-api-access-69lsr\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.944349 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69lsr\" (UniqueName: \"kubernetes.io/projected/ed22a27c-dae6-448c-b789-85add05aff31-kube-api-access-69lsr\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.944526 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-internal-tls-certs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.944549 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-config-data\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.944616 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-combined-ca-bundle\") pod \"barbican-api-6c4f87b654-hv865\" (UID: 
\"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.944644 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed22a27c-dae6-448c-b789-85add05aff31-logs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.944677 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-public-tls-certs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.944756 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-config-data-custom\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.945981 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed22a27c-dae6-448c-b789-85add05aff31-logs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.950827 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-combined-ca-bundle\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" 
Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.951634 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-config-data\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.952174 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-internal-tls-certs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.953826 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-config-data-custom\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.954436 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed22a27c-dae6-448c-b789-85add05aff31-public-tls-certs\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:31 crc kubenswrapper[4786]: I1002 07:01:31.959962 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69lsr\" (UniqueName: \"kubernetes.io/projected/ed22a27c-dae6-448c-b789-85add05aff31-kube-api-access-69lsr\") pod \"barbican-api-6c4f87b654-hv865\" (UID: \"ed22a27c-dae6-448c-b789-85add05aff31\") " pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:32 crc kubenswrapper[4786]: I1002 07:01:32.053748 4786 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:37 crc kubenswrapper[4786]: E1002 07:01:37.475126 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48" Oct 02 07:01:37 crc kubenswrapper[4786]: E1002 07:01:37.476034 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rblwg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessPr
obe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(46da85da-85a6-4ff7-8410-c26ccd99967e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 07:01:37 crc kubenswrapper[4786]: E1002 07:01:37.477292 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.690264 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.759944 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-swift-storage-0\") pod \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.759997 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-nb\") pod \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.760058 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k6hq\" (UniqueName: \"kubernetes.io/projected/d62f273f-e2ef-4c5d-a319-1f38ee22d823-kube-api-access-7k6hq\") pod \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.760196 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-config\") pod \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.760273 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-svc\") pod \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.760439 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-sb\") pod \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\" (UID: \"d62f273f-e2ef-4c5d-a319-1f38ee22d823\") " Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.765675 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62f273f-e2ef-4c5d-a319-1f38ee22d823-kube-api-access-7k6hq" (OuterVolumeSpecName: "kube-api-access-7k6hq") pod "d62f273f-e2ef-4c5d-a319-1f38ee22d823" (UID: "d62f273f-e2ef-4c5d-a319-1f38ee22d823"). InnerVolumeSpecName "kube-api-access-7k6hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.798617 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d62f273f-e2ef-4c5d-a319-1f38ee22d823" (UID: "d62f273f-e2ef-4c5d-a319-1f38ee22d823"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.800915 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d62f273f-e2ef-4c5d-a319-1f38ee22d823" (UID: "d62f273f-e2ef-4c5d-a319-1f38ee22d823"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.804453 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-config" (OuterVolumeSpecName: "config") pod "d62f273f-e2ef-4c5d-a319-1f38ee22d823" (UID: "d62f273f-e2ef-4c5d-a319-1f38ee22d823"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.805864 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d62f273f-e2ef-4c5d-a319-1f38ee22d823" (UID: "d62f273f-e2ef-4c5d-a319-1f38ee22d823"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.816108 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d62f273f-e2ef-4c5d-a319-1f38ee22d823" (UID: "d62f273f-e2ef-4c5d-a319-1f38ee22d823"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.863303 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.863328 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.863342 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k6hq\" (UniqueName: \"kubernetes.io/projected/d62f273f-e2ef-4c5d-a319-1f38ee22d823-kube-api-access-7k6hq\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.863352 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-config\") on node 
\"crc\" DevicePath \"\"" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.863362 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.863370 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d62f273f-e2ef-4c5d-a319-1f38ee22d823-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.870739 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c985b58cc-mrm2m" event={"ID":"89685228-db75-441e-83ae-74720db4de72","Type":"ContainerStarted","Data":"2d3aa1f0d9c123c9eaed2f2a12977c6c80fe5c2a3a6a291b799ebb3e2aadbbe3"} Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.872650 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.872626 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" event={"ID":"d62f273f-e2ef-4c5d-a319-1f38ee22d823","Type":"ContainerDied","Data":"754e11bb94723bd0be8be64f0fdff298e823cc3f7a687c5aab9e46e1570d63d1"} Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.872735 4786 scope.go:117] "RemoveContainer" containerID="365b68d0b2dd6598bfa7ba3198fc52d68d666b8e55c3e46982c925e05253937d" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.872873 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="sg-core" containerID="cri-o://d725fe44277f0a83e913ed067f7f05ca9e57cadd8fdd80931e59e56ef7973a73" gracePeriod=30 Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.872874 4786 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/ceilometer-0" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="ceilometer-notification-agent" containerID="cri-o://313fb5372e8a5c77898d9f6cf7e8584a9bbf717ffb6c3da2aa1f19f15298ac97" gracePeriod=30 Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.873108 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="ceilometer-central-agent" containerID="cri-o://acea46ce74fcd747001e70e1a189e72bcbd4c85e213c49055b801c8197f289f7" gracePeriod=30 Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.908754 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-755cbf448b-g8s9l"] Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.912744 4786 scope.go:117] "RemoveContainer" containerID="7e7d2c65006a0ce09651b668934485c8654baeb54290a7734c7beee532f068e2" Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.916904 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb"] Oct 02 07:01:37 crc kubenswrapper[4786]: I1002 07:01:37.923466 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb"] Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.006236 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5fc459f649-l589x"] Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.011680 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84cf884f69-szg6z"] Oct 02 07:01:38 crc kubenswrapper[4786]: W1002 07:01:38.018724 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0206ed2e_dfee_4f7e_8a2e_5b2158e69a57.slice/crio-23bfbed7fa0ae0152f8200c22ca2ddae14f2d10ef420aa2e9402c3ee1fc4ac10 WatchSource:0}: Error finding container 
23bfbed7fa0ae0152f8200c22ca2ddae14f2d10ef420aa2e9402c3ee1fc4ac10: Status 404 returned error can't find the container with id 23bfbed7fa0ae0152f8200c22ca2ddae14f2d10ef420aa2e9402c3ee1fc4ac10 Oct 02 07:01:38 crc kubenswrapper[4786]: W1002 07:01:38.021060 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf54426b_d853_4806_bdf8_c1fd22cb6752.slice/crio-14dd9a531ff229a8344106f21ee45fd5d772879b1c3abbec69c37492f45fdac5 WatchSource:0}: Error finding container 14dd9a531ff229a8344106f21ee45fd5d772879b1c3abbec69c37492f45fdac5: Status 404 returned error can't find the container with id 14dd9a531ff229a8344106f21ee45fd5d772879b1c3abbec69c37492f45fdac5 Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.100184 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c598894db-vwfzw"] Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.109744 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c4f87b654-hv865"] Oct 02 07:01:38 crc kubenswrapper[4786]: W1002 07:01:38.116035 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d65eb2_c554_4ca7_a21a_16375fcbd118.slice/crio-0ffb17f61d81dec1353b34bb85f414135293be0aee3f94dbb70f249e6e45c068 WatchSource:0}: Error finding container 0ffb17f61d81dec1353b34bb85f414135293be0aee3f94dbb70f249e6e45c068: Status 404 returned error can't find the container with id 0ffb17f61d81dec1353b34bb85f414135293be0aee3f94dbb70f249e6e45c068 Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.191613 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" path="/var/lib/kubelet/pods/d62f273f-e2ef-4c5d-a319-1f38ee22d823/volumes" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.880770 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-5c598894db-vwfzw" event={"ID":"36d65eb2-c554-4ca7-a21a-16375fcbd118","Type":"ContainerStarted","Data":"1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.881022 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.881035 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c598894db-vwfzw" event={"ID":"36d65eb2-c554-4ca7-a21a-16375fcbd118","Type":"ContainerStarted","Data":"ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.881045 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c598894db-vwfzw" event={"ID":"36d65eb2-c554-4ca7-a21a-16375fcbd118","Type":"ContainerStarted","Data":"0ffb17f61d81dec1353b34bb85f414135293be0aee3f94dbb70f249e6e45c068"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.881054 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.885409 4786 generic.go:334] "Generic (PLEG): container finished" podID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerID="d725fe44277f0a83e913ed067f7f05ca9e57cadd8fdd80931e59e56ef7973a73" exitCode=2 Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.885433 4786 generic.go:334] "Generic (PLEG): container finished" podID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerID="acea46ce74fcd747001e70e1a189e72bcbd4c85e213c49055b801c8197f289f7" exitCode=0 Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.885470 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46da85da-85a6-4ff7-8410-c26ccd99967e","Type":"ContainerDied","Data":"d725fe44277f0a83e913ed067f7f05ca9e57cadd8fdd80931e59e56ef7973a73"} Oct 
02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.885504 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46da85da-85a6-4ff7-8410-c26ccd99967e","Type":"ContainerDied","Data":"acea46ce74fcd747001e70e1a189e72bcbd4c85e213c49055b801c8197f289f7"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.886506 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g9gtn" event={"ID":"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd","Type":"ContainerStarted","Data":"1ad3d16a6c45db5808d8b0214b1ab84c4ff660a73dc5d71f6a4e0ae76e53f735"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.887744 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c985b58cc-mrm2m" event={"ID":"89685228-db75-441e-83ae-74720db4de72","Type":"ContainerStarted","Data":"37fb4db12a5f6ac326831db4ec1a2adaf985f43c21b9399db7bc77c90c387d0a"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.887846 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.888961 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fc459f649-l589x" event={"ID":"af54426b-d853-4806-bdf8-c1fd22cb6752","Type":"ContainerStarted","Data":"14dd9a531ff229a8344106f21ee45fd5d772879b1c3abbec69c37492f45fdac5"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.891169 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c4f87b654-hv865" event={"ID":"ed22a27c-dae6-448c-b789-85add05aff31","Type":"ContainerStarted","Data":"45fe36fd49a425db28ebbe34b34b9c9741486db35ea2af1b945fbe5b7163377c"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.891192 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c4f87b654-hv865" 
event={"ID":"ed22a27c-dae6-448c-b789-85add05aff31","Type":"ContainerStarted","Data":"1e5855e0d7196e93efa8dc6e9825160379a4b447b76cb42d2e6dd8e8ad8ec27d"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.891205 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.891214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c4f87b654-hv865" event={"ID":"ed22a27c-dae6-448c-b789-85add05aff31","Type":"ContainerStarted","Data":"f953e1b2314390c43308d2936e51e94b302363fee7fb6455b02cd723652fee83"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.891224 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.901248 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c598894db-vwfzw" podStartSLOduration=10.9012298 podStartE2EDuration="10.9012298s" podCreationTimestamp="2025-10-02 07:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:38.891817546 +0000 UTC m=+909.013000687" watchObservedRunningTime="2025-10-02 07:01:38.9012298 +0000 UTC m=+909.022412932" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.901919 4786 generic.go:334] "Generic (PLEG): container finished" podID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" containerID="ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5" exitCode=0 Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.901978 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" event={"ID":"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57","Type":"ContainerDied","Data":"ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5"} Oct 02 07:01:38 crc kubenswrapper[4786]: 
I1002 07:01:38.902000 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" event={"ID":"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57","Type":"ContainerStarted","Data":"23bfbed7fa0ae0152f8200c22ca2ddae14f2d10ef420aa2e9402c3ee1fc4ac10"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.903075 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" event={"ID":"812267df-c387-42a6-a6b4-758beccdd77d","Type":"ContainerStarted","Data":"319a09fe56e16ac038f9b3df668b37121a1b52c89decb16ce5d1b51881f22261"} Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.930357 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-g9gtn" podStartSLOduration=3.222744614 podStartE2EDuration="26.930336494s" podCreationTimestamp="2025-10-02 07:01:12 +0000 UTC" firstStartedPulling="2025-10-02 07:01:13.877069855 +0000 UTC m=+883.998252985" lastFinishedPulling="2025-10-02 07:01:37.584661734 +0000 UTC m=+907.705844865" observedRunningTime="2025-10-02 07:01:38.926970213 +0000 UTC m=+909.048153354" watchObservedRunningTime="2025-10-02 07:01:38.930336494 +0000 UTC m=+909.051519624" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.931217 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6c4f87b654-hv865" podStartSLOduration=7.931206655 podStartE2EDuration="7.931206655s" podCreationTimestamp="2025-10-02 07:01:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:38.912964465 +0000 UTC m=+909.034147596" watchObservedRunningTime="2025-10-02 07:01:38.931206655 +0000 UTC m=+909.052389785" Oct 02 07:01:38 crc kubenswrapper[4786]: I1002 07:01:38.968254 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c985b58cc-mrm2m" podStartSLOduration=11.968232565 
podStartE2EDuration="11.968232565s" podCreationTimestamp="2025-10-02 07:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:38.948923393 +0000 UTC m=+909.070106534" watchObservedRunningTime="2025-10-02 07:01:38.968232565 +0000 UTC m=+909.089415696" Oct 02 07:01:39 crc kubenswrapper[4786]: I1002 07:01:39.911906 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" event={"ID":"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57","Type":"ContainerStarted","Data":"55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c"} Oct 02 07:01:39 crc kubenswrapper[4786]: I1002 07:01:39.936763 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" podStartSLOduration=11.936745941 podStartE2EDuration="11.936745941s" podCreationTimestamp="2025-10-02 07:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:39.928176086 +0000 UTC m=+910.049359227" watchObservedRunningTime="2025-10-02 07:01:39.936745941 +0000 UTC m=+910.057929062" Oct 02 07:01:40 crc kubenswrapper[4786]: I1002 07:01:40.405618 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb8bc4c8f-w7dhb" podUID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: i/o timeout" Oct 02 07:01:40 crc kubenswrapper[4786]: I1002 07:01:40.921547 4786 generic.go:334] "Generic (PLEG): container finished" podID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerID="313fb5372e8a5c77898d9f6cf7e8584a9bbf717ffb6c3da2aa1f19f15298ac97" exitCode=0 Oct 02 07:01:40 crc kubenswrapper[4786]: I1002 07:01:40.921932 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"46da85da-85a6-4ff7-8410-c26ccd99967e","Type":"ContainerDied","Data":"313fb5372e8a5c77898d9f6cf7e8584a9bbf717ffb6c3da2aa1f19f15298ac97"} Oct 02 07:01:40 crc kubenswrapper[4786]: I1002 07:01:40.924058 4786 generic.go:334] "Generic (PLEG): container finished" podID="609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" containerID="1ad3d16a6c45db5808d8b0214b1ab84c4ff660a73dc5d71f6a4e0ae76e53f735" exitCode=0 Oct 02 07:01:40 crc kubenswrapper[4786]: I1002 07:01:40.924150 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g9gtn" event={"ID":"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd","Type":"ContainerDied","Data":"1ad3d16a6c45db5808d8b0214b1ab84c4ff660a73dc5d71f6a4e0ae76e53f735"} Oct 02 07:01:40 crc kubenswrapper[4786]: I1002 07:01:40.924264 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.034967 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.118987 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-run-httpd\") pod \"46da85da-85a6-4ff7-8410-c26ccd99967e\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.119035 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rblwg\" (UniqueName: \"kubernetes.io/projected/46da85da-85a6-4ff7-8410-c26ccd99967e-kube-api-access-rblwg\") pod \"46da85da-85a6-4ff7-8410-c26ccd99967e\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.119081 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-combined-ca-bundle\") pod \"46da85da-85a6-4ff7-8410-c26ccd99967e\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.119899 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-sg-core-conf-yaml\") pod \"46da85da-85a6-4ff7-8410-c26ccd99967e\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.119984 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-config-data\") pod \"46da85da-85a6-4ff7-8410-c26ccd99967e\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.120119 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-log-httpd\") pod \"46da85da-85a6-4ff7-8410-c26ccd99967e\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.120250 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-scripts\") pod \"46da85da-85a6-4ff7-8410-c26ccd99967e\" (UID: \"46da85da-85a6-4ff7-8410-c26ccd99967e\") " Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.121365 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "46da85da-85a6-4ff7-8410-c26ccd99967e" (UID: "46da85da-85a6-4ff7-8410-c26ccd99967e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.121740 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.122357 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "46da85da-85a6-4ff7-8410-c26ccd99967e" (UID: "46da85da-85a6-4ff7-8410-c26ccd99967e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.124768 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46da85da-85a6-4ff7-8410-c26ccd99967e-kube-api-access-rblwg" (OuterVolumeSpecName: "kube-api-access-rblwg") pod "46da85da-85a6-4ff7-8410-c26ccd99967e" (UID: "46da85da-85a6-4ff7-8410-c26ccd99967e"). 
InnerVolumeSpecName "kube-api-access-rblwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.125831 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-scripts" (OuterVolumeSpecName: "scripts") pod "46da85da-85a6-4ff7-8410-c26ccd99967e" (UID: "46da85da-85a6-4ff7-8410-c26ccd99967e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.141343 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "46da85da-85a6-4ff7-8410-c26ccd99967e" (UID: "46da85da-85a6-4ff7-8410-c26ccd99967e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.157897 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-config-data" (OuterVolumeSpecName: "config-data") pod "46da85da-85a6-4ff7-8410-c26ccd99967e" (UID: "46da85da-85a6-4ff7-8410-c26ccd99967e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.159842 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46da85da-85a6-4ff7-8410-c26ccd99967e" (UID: "46da85da-85a6-4ff7-8410-c26ccd99967e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.223129 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.223154 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.223163 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46da85da-85a6-4ff7-8410-c26ccd99967e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.223171 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.223181 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rblwg\" (UniqueName: \"kubernetes.io/projected/46da85da-85a6-4ff7-8410-c26ccd99967e-kube-api-access-rblwg\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.223191 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da85da-85a6-4ff7-8410-c26ccd99967e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.936104 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" event={"ID":"812267df-c387-42a6-a6b4-758beccdd77d","Type":"ContainerStarted","Data":"397185b447fc8b6fae586414206f06eaed124ec494d9b7241d9682116264d506"} Oct 02 07:01:41 crc 
kubenswrapper[4786]: I1002 07:01:41.936449 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" event={"ID":"812267df-c387-42a6-a6b4-758beccdd77d","Type":"ContainerStarted","Data":"ca13812f7e8a64ab1cdb0fe058e1c43a107174a5b952ebf3b3ca010d9ae78d19"} Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.939593 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46da85da-85a6-4ff7-8410-c26ccd99967e","Type":"ContainerDied","Data":"3a505dc4e7b8923b2818287c12249880107aa4b6ab2fe60695f0ae93dda4742a"} Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.939639 4786 scope.go:117] "RemoveContainer" containerID="d725fe44277f0a83e913ed067f7f05ca9e57cadd8fdd80931e59e56ef7973a73" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.939765 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.947996 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fc459f649-l589x" event={"ID":"af54426b-d853-4806-bdf8-c1fd22cb6752","Type":"ContainerStarted","Data":"298e63be6ec947c7243f5fcbd1ea3bdd8901e4fd98c601d3c0013dc23417565f"} Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.948035 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5fc459f649-l589x" event={"ID":"af54426b-d853-4806-bdf8-c1fd22cb6752","Type":"ContainerStarted","Data":"97bd65b18a1ac8a57d6a94041830a698962ad5d3eba234a7550e57b06cf819ef"} Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.955813 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-755cbf448b-g8s9l" podStartSLOduration=12.126148234 podStartE2EDuration="14.955798965s" podCreationTimestamp="2025-10-02 07:01:27 +0000 UTC" firstStartedPulling="2025-10-02 07:01:37.932097798 +0000 UTC m=+908.053280918" 
lastFinishedPulling="2025-10-02 07:01:40.761748508 +0000 UTC m=+910.882931649" observedRunningTime="2025-10-02 07:01:41.947343918 +0000 UTC m=+912.068527059" watchObservedRunningTime="2025-10-02 07:01:41.955798965 +0000 UTC m=+912.076982096" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.970090 4786 scope.go:117] "RemoveContainer" containerID="313fb5372e8a5c77898d9f6cf7e8584a9bbf717ffb6c3da2aa1f19f15298ac97" Oct 02 07:01:41 crc kubenswrapper[4786]: I1002 07:01:41.989338 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5fc459f649-l589x" podStartSLOduration=12.24583745 podStartE2EDuration="14.989318042s" podCreationTimestamp="2025-10-02 07:01:27 +0000 UTC" firstStartedPulling="2025-10-02 07:01:38.02287668 +0000 UTC m=+908.144059811" lastFinishedPulling="2025-10-02 07:01:40.766357272 +0000 UTC m=+910.887540403" observedRunningTime="2025-10-02 07:01:41.974772765 +0000 UTC m=+912.095955906" watchObservedRunningTime="2025-10-02 07:01:41.989318042 +0000 UTC m=+912.110501162" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.007141 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.012613 4786 scope.go:117] "RemoveContainer" containerID="acea46ce74fcd747001e70e1a189e72bcbd4c85e213c49055b801c8197f289f7" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.014106 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.028381 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:42 crc kubenswrapper[4786]: E1002 07:01:42.028872 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="sg-core" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.028890 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="sg-core" Oct 02 07:01:42 crc kubenswrapper[4786]: E1002 07:01:42.028901 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerName="init" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.028909 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerName="init" Oct 02 07:01:42 crc kubenswrapper[4786]: E1002 07:01:42.028922 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="ceilometer-central-agent" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.028929 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="ceilometer-central-agent" Oct 02 07:01:42 crc kubenswrapper[4786]: E1002 07:01:42.028969 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerName="dnsmasq-dns" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.028976 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerName="dnsmasq-dns" Oct 02 07:01:42 crc kubenswrapper[4786]: E1002 07:01:42.028987 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="ceilometer-notification-agent" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.028993 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="ceilometer-notification-agent" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.029225 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="sg-core" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.029238 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d62f273f-e2ef-4c5d-a319-1f38ee22d823" containerName="dnsmasq-dns" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.029307 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="ceilometer-notification-agent" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.029314 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" containerName="ceilometer-central-agent" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.033760 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.036393 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.036759 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.040395 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.136796 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-run-httpd\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.136841 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shpms\" (UniqueName: \"kubernetes.io/projected/88561e6d-11c9-481c-b10c-db39b7291394-kube-api-access-shpms\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.136881 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-config-data\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.136966 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.137037 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-log-httpd\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.137122 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.137261 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-scripts\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.192311 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46da85da-85a6-4ff7-8410-c26ccd99967e" 
path="/var/lib/kubelet/pods/46da85da-85a6-4ff7-8410-c26ccd99967e/volumes" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.238536 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-log-httpd\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.238642 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.238763 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-scripts\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.238836 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-run-httpd\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.238872 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shpms\" (UniqueName: \"kubernetes.io/projected/88561e6d-11c9-481c-b10c-db39b7291394-kube-api-access-shpms\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.238939 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-config-data\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.238987 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.238993 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-log-httpd\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.239836 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-run-httpd\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.245235 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.245419 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-scripts\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.245715 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-config-data\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.246278 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.248596 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.253110 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shpms\" (UniqueName: \"kubernetes.io/projected/88561e6d-11c9-481c-b10c-db39b7291394-kube-api-access-shpms\") pod \"ceilometer-0\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.340071 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-config-data\") pod \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.340380 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-db-sync-config-data\") pod \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.340410 4786 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-scripts\") pod \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.340442 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-combined-ca-bundle\") pod \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.340463 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-etc-machine-id\") pod \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.340566 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mp25\" (UniqueName: \"kubernetes.io/projected/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-kube-api-access-8mp25\") pod \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\" (UID: \"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd\") " Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.340805 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" (UID: "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.341102 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.343524 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" (UID: "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.344027 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-kube-api-access-8mp25" (OuterVolumeSpecName: "kube-api-access-8mp25") pod "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" (UID: "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd"). InnerVolumeSpecName "kube-api-access-8mp25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.344738 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-scripts" (OuterVolumeSpecName: "scripts") pod "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" (UID: "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.347123 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.362927 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" (UID: "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.380671 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-config-data" (OuterVolumeSpecName: "config-data") pod "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" (UID: "609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.442877 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.442903 4786 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.442912 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.442937 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 
07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.442950 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mp25\" (UniqueName: \"kubernetes.io/projected/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd-kube-api-access-8mp25\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.776278 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:42 crc kubenswrapper[4786]: W1002 07:01:42.782386 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88561e6d_11c9_481c_b10c_db39b7291394.slice/crio-f298318b5f484db0b50fa93831dce2155e499ab60275d11e6e8873ee1945e88c WatchSource:0}: Error finding container f298318b5f484db0b50fa93831dce2155e499ab60275d11e6e8873ee1945e88c: Status 404 returned error can't find the container with id f298318b5f484db0b50fa93831dce2155e499ab60275d11e6e8873ee1945e88c Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.954376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerStarted","Data":"f298318b5f484db0b50fa93831dce2155e499ab60275d11e6e8873ee1945e88c"} Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.956731 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g9gtn" event={"ID":"609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd","Type":"ContainerDied","Data":"ff064ccd007db3ebad58f98aea233a0b117745be4f5fbfa037f2c44c10268a4c"} Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.956762 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g9gtn" Oct 02 07:01:42 crc kubenswrapper[4786]: I1002 07:01:42.956770 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff064ccd007db3ebad58f98aea233a0b117745be4f5fbfa037f2c44c10268a4c" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.212564 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:43 crc kubenswrapper[4786]: E1002 07:01:43.212902 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" containerName="cinder-db-sync" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.212928 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" containerName="cinder-db-sync" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.213144 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" containerName="cinder-db-sync" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.213952 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.217119 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-v4bzm" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.217144 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.217332 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.217420 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.222434 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.257188 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhg4n\" (UniqueName: \"kubernetes.io/projected/df8b9476-3526-44a6-8a19-486e005a851d-kube-api-access-zhg4n\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.257240 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.257296 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.257408 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8b9476-3526-44a6-8a19-486e005a851d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.257477 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.257530 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.317482 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84cf884f69-szg6z"] Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.317737 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" podUID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" containerName="dnsmasq-dns" containerID="cri-o://55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c" gracePeriod=10 Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.349839 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64d9b8cc-868kq"] Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.351615 4786 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.360041 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64d9b8cc-868kq"] Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.362206 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.362269 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.362434 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg4n\" (UniqueName: \"kubernetes.io/projected/df8b9476-3526-44a6-8a19-486e005a851d-kube-api-access-zhg4n\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.362455 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.362478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.362547 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8b9476-3526-44a6-8a19-486e005a851d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.362660 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8b9476-3526-44a6-8a19-486e005a851d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.374623 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.379380 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.382932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.387312 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhg4n\" (UniqueName: \"kubernetes.io/projected/df8b9476-3526-44a6-8a19-486e005a851d-kube-api-access-zhg4n\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.387810 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.465082 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-config\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.465293 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-swift-storage-0\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.465407 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7zvr\" (UniqueName: \"kubernetes.io/projected/aecc9aef-011b-45c9-8e92-0c41524ab191-kube-api-access-d7zvr\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.465504 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-nb\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.465593 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-sb\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.465786 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-svc\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.485751 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.487509 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.490858 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.508340 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.533276 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.567996 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-sb\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568118 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568155 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-scripts\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568222 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-logs\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568258 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-svc\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568300 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568347 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mnp\" (UniqueName: \"kubernetes.io/projected/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-kube-api-access-p4mnp\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568392 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-config\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568447 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-swift-storage-0\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568467 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568528 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d7zvr\" (UniqueName: \"kubernetes.io/projected/aecc9aef-011b-45c9-8e92-0c41524ab191-kube-api-access-d7zvr\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568567 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-nb\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.568601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.569014 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-sb\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.569767 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-svc\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.570077 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-swift-storage-0\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.570350 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-config\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.570928 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-nb\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.590946 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7zvr\" (UniqueName: \"kubernetes.io/projected/aecc9aef-011b-45c9-8e92-0c41524ab191-kube-api-access-d7zvr\") pod \"dnsmasq-dns-64d9b8cc-868kq\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") " pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.675984 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-logs\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.676367 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.676426 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mnp\" (UniqueName: \"kubernetes.io/projected/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-kube-api-access-p4mnp\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.676541 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.676645 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.676756 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.676802 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-scripts\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.677225 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-logs\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.680081 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.682887 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-scripts\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.689266 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.696749 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.701153 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mnp\" (UniqueName: \"kubernetes.io/projected/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-kube-api-access-p4mnp\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.703331 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data\") pod \"cinder-api-0\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.838369 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.846606 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 07:01:43 crc kubenswrapper[4786]: I1002 07:01:43.969225 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.006081 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerStarted","Data":"ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e"} Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.018218 4786 generic.go:334] "Generic (PLEG): container finished" podID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" containerID="55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c" exitCode=0 Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.018258 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" event={"ID":"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57","Type":"ContainerDied","Data":"55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c"} Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.018284 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" event={"ID":"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57","Type":"ContainerDied","Data":"23bfbed7fa0ae0152f8200c22ca2ddae14f2d10ef420aa2e9402c3ee1fc4ac10"} Oct 
02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.018302 4786 scope.go:117] "RemoveContainer" containerID="55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.018363 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84cf884f69-szg6z" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.056143 4786 scope.go:117] "RemoveContainer" containerID="ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.084553 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.100752 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-svc\") pod \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.101236 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpt8q\" (UniqueName: \"kubernetes.io/projected/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-kube-api-access-bpt8q\") pod \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.101312 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-config\") pod \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.101427 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-sb\") pod \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.101513 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-nb\") pod \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.101615 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-swift-storage-0\") pod \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\" (UID: \"0206ed2e-dfee-4f7e-8a2e-5b2158e69a57\") " Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.106900 4786 scope.go:117] "RemoveContainer" containerID="55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c" Oct 02 07:01:44 crc kubenswrapper[4786]: E1002 07:01:44.110144 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c\": container with ID starting with 55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c not found: ID does not exist" containerID="55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.110184 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c"} err="failed to get container status \"55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c\": rpc error: code = NotFound desc = could not find container 
\"55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c\": container with ID starting with 55ab00420c35e660dcd2fc4f2fac28967b6659c39aebca83319a65a629905d1c not found: ID does not exist" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.110207 4786 scope.go:117] "RemoveContainer" containerID="ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5" Oct 02 07:01:44 crc kubenswrapper[4786]: E1002 07:01:44.111385 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5\": container with ID starting with ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5 not found: ID does not exist" containerID="ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.111410 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5"} err="failed to get container status \"ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5\": rpc error: code = NotFound desc = could not find container \"ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5\": container with ID starting with ec16f2d593320b44b6042d17b0dce60d137cef3d55fbc7793e1d76cea73085a5 not found: ID does not exist" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.118759 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-kube-api-access-bpt8q" (OuterVolumeSpecName: "kube-api-access-bpt8q") pod "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" (UID: "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57"). InnerVolumeSpecName "kube-api-access-bpt8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.148616 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" (UID: "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.155828 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" (UID: "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.174544 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" (UID: "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.177022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" (UID: "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.186886 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-config" (OuterVolumeSpecName: "config") pod "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" (UID: "0206ed2e-dfee-4f7e-8a2e-5b2158e69a57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.206339 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.206539 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.206614 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpt8q\" (UniqueName: \"kubernetes.io/projected/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-kube-api-access-bpt8q\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.206756 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.206822 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.206894 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.339904 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84cf884f69-szg6z"] Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.361370 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84cf884f69-szg6z"] Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.450620 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64d9b8cc-868kq"] Oct 02 07:01:44 crc kubenswrapper[4786]: I1002 07:01:44.465465 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 07:01:45 crc kubenswrapper[4786]: I1002 07:01:45.036700 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df8b9476-3526-44a6-8a19-486e005a851d","Type":"ContainerStarted","Data":"73b5ed2d15918c88304a1976c5659070ea84a8dea1ccd56c7ceacfce4232bcd3"} Oct 02 07:01:45 crc kubenswrapper[4786]: I1002 07:01:45.038591 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f","Type":"ContainerStarted","Data":"d32e17ee8dabf0e119212245e1a3267f9c4e8c0afc98a6b741e0ef2fa8c528fc"} Oct 02 07:01:45 crc kubenswrapper[4786]: I1002 07:01:45.040251 4786 generic.go:334] "Generic (PLEG): container finished" podID="aecc9aef-011b-45c9-8e92-0c41524ab191" containerID="d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c" exitCode=0 Oct 02 07:01:45 crc kubenswrapper[4786]: I1002 07:01:45.040293 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" event={"ID":"aecc9aef-011b-45c9-8e92-0c41524ab191","Type":"ContainerDied","Data":"d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c"} Oct 02 07:01:45 crc kubenswrapper[4786]: I1002 07:01:45.040336 4786 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" event={"ID":"aecc9aef-011b-45c9-8e92-0c41524ab191","Type":"ContainerStarted","Data":"cee2b1ed022a47a6e3fce3f86034c67d93f341f5978bd7752ccd318169ff6cc3"} Oct 02 07:01:45 crc kubenswrapper[4786]: I1002 07:01:45.102361 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:45 crc kubenswrapper[4786]: I1002 07:01:45.222538 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 07:01:45 crc kubenswrapper[4786]: I1002 07:01:45.296902 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.077514 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df8b9476-3526-44a6-8a19-486e005a851d","Type":"ContainerStarted","Data":"c9a877d72ec64f9d9967f81dadd9a543de1950dd8ef8a9463fdfc4cb8a7a5d7e"} Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.084347 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f","Type":"ContainerStarted","Data":"ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10"} Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.084399 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f","Type":"ContainerStarted","Data":"8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa"} Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.084472 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerName="cinder-api-log" containerID="cri-o://8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa" 
gracePeriod=30 Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.084530 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.084568 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerName="cinder-api" containerID="cri-o://ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10" gracePeriod=30 Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.088966 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerStarted","Data":"c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d"} Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.092486 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" event={"ID":"aecc9aef-011b-45c9-8e92-0c41524ab191","Type":"ContainerStarted","Data":"fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122"} Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.107020 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.107003568 podStartE2EDuration="3.107003568s" podCreationTimestamp="2025-10-02 07:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:46.100365646 +0000 UTC m=+916.221548777" watchObservedRunningTime="2025-10-02 07:01:46.107003568 +0000 UTC m=+916.228186699" Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.121283 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" podStartSLOduration=3.121269367 podStartE2EDuration="3.121269367s" podCreationTimestamp="2025-10-02 07:01:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:46.119627512 +0000 UTC m=+916.240810653" watchObservedRunningTime="2025-10-02 07:01:46.121269367 +0000 UTC m=+916.242452497" Oct 02 07:01:46 crc kubenswrapper[4786]: I1002 07:01:46.194203 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" path="/var/lib/kubelet/pods/0206ed2e-dfee-4f7e-8a2e-5b2158e69a57/volumes" Oct 02 07:01:47 crc kubenswrapper[4786]: I1002 07:01:47.105868 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df8b9476-3526-44a6-8a19-486e005a851d","Type":"ContainerStarted","Data":"c9f6d525107dd27850c86f77865cc9ba693077b3d6cf448bc5da12e619ee0693"} Oct 02 07:01:47 crc kubenswrapper[4786]: I1002 07:01:47.109370 4786 generic.go:334] "Generic (PLEG): container finished" podID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerID="8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa" exitCode=143 Oct 02 07:01:47 crc kubenswrapper[4786]: I1002 07:01:47.109455 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f","Type":"ContainerDied","Data":"8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa"} Oct 02 07:01:47 crc kubenswrapper[4786]: I1002 07:01:47.111796 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerStarted","Data":"3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180"} Oct 02 07:01:47 crc kubenswrapper[4786]: I1002 07:01:47.112053 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:47 crc kubenswrapper[4786]: I1002 07:01:47.120628 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-scheduler-0" podStartSLOduration=3.067313226 podStartE2EDuration="4.120613934s" podCreationTimestamp="2025-10-02 07:01:43 +0000 UTC" firstStartedPulling="2025-10-02 07:01:44.120417854 +0000 UTC m=+914.241600984" lastFinishedPulling="2025-10-02 07:01:45.173718572 +0000 UTC m=+915.294901692" observedRunningTime="2025-10-02 07:01:47.120252223 +0000 UTC m=+917.241435353" watchObservedRunningTime="2025-10-02 07:01:47.120613934 +0000 UTC m=+917.241797066" Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.127209 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerStarted","Data":"2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022"} Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.127667 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.528105 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.533921 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.547450 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.460192356 podStartE2EDuration="6.54743315s" podCreationTimestamp="2025-10-02 07:01:42 +0000 UTC" firstStartedPulling="2025-10-02 07:01:42.78426426 +0000 UTC m=+912.905447391" lastFinishedPulling="2025-10-02 07:01:47.871505054 +0000 UTC m=+917.992688185" observedRunningTime="2025-10-02 07:01:48.162825191 +0000 UTC m=+918.284008332" watchObservedRunningTime="2025-10-02 07:01:48.54743315 +0000 UTC m=+918.668616281" Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.669664 4786 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-6c4f87b654-hv865" Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.730853 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c598894db-vwfzw"] Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.731035 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c598894db-vwfzw" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api-log" containerID="cri-o://ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269" gracePeriod=30 Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.731178 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c598894db-vwfzw" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api" containerID="cri-o://1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515" gracePeriod=30 Oct 02 07:01:48 crc kubenswrapper[4786]: I1002 07:01:48.747705 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c598894db-vwfzw" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": EOF" Oct 02 07:01:49 crc kubenswrapper[4786]: I1002 07:01:49.136541 4786 generic.go:334] "Generic (PLEG): container finished" podID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerID="ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269" exitCode=143 Oct 02 07:01:49 crc kubenswrapper[4786]: I1002 07:01:49.136635 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c598894db-vwfzw" event={"ID":"36d65eb2-c554-4ca7-a21a-16375fcbd118","Type":"ContainerDied","Data":"ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269"} Oct 02 07:01:49 crc kubenswrapper[4786]: I1002 07:01:49.366141 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:49 crc kubenswrapper[4786]: I1002 07:01:49.416140 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8c555dcd8-dbpk5" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.209955 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.276872 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.352209 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.356957 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.528538 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-749d95758-7hp9f" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.914783 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 07:01:50 crc kubenswrapper[4786]: E1002 07:01:50.915172 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" containerName="dnsmasq-dns" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.915198 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" containerName="dnsmasq-dns" Oct 02 07:01:50 crc kubenswrapper[4786]: E1002 07:01:50.915222 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" containerName="init" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.915228 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" containerName="init" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.915429 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0206ed2e-dfee-4f7e-8a2e-5b2158e69a57" containerName="dnsmasq-dns" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.916034 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.917921 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.918194 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-mxhw7" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.918728 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.923736 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.949457 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f57411f4-01bc-4ecc-a240-9101e861f97d-openstack-config\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.949708 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f57411f4-01bc-4ecc-a240-9101e861f97d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.949783 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqcv\" (UniqueName: \"kubernetes.io/projected/f57411f4-01bc-4ecc-a240-9101e861f97d-kube-api-access-xjqcv\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:50 crc kubenswrapper[4786]: I1002 07:01:50.949812 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57411f4-01bc-4ecc-a240-9101e861f97d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.051109 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f57411f4-01bc-4ecc-a240-9101e861f97d-openstack-config\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.051187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f57411f4-01bc-4ecc-a240-9101e861f97d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.051211 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57411f4-01bc-4ecc-a240-9101e861f97d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.051901 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f57411f4-01bc-4ecc-a240-9101e861f97d-openstack-config\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.051932 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqcv\" (UniqueName: \"kubernetes.io/projected/f57411f4-01bc-4ecc-a240-9101e861f97d-kube-api-access-xjqcv\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.058195 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f57411f4-01bc-4ecc-a240-9101e861f97d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.058759 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57411f4-01bc-4ecc-a240-9101e861f97d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.065262 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjqcv\" (UniqueName: \"kubernetes.io/projected/f57411f4-01bc-4ecc-a240-9101e861f97d-kube-api-access-xjqcv\") pod \"openstackclient\" (UID: \"f57411f4-01bc-4ecc-a240-9101e861f97d\") " pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.235130 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 07:01:51 crc kubenswrapper[4786]: I1002 07:01:51.643806 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 07:01:51 crc kubenswrapper[4786]: W1002 07:01:51.654001 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf57411f4_01bc_4ecc_a240_9101e861f97d.slice/crio-789d035a9f79f391f48d9eb34015820b8e6c062bb4386625c7771acf258c6ea1 WatchSource:0}: Error finding container 789d035a9f79f391f48d9eb34015820b8e6c062bb4386625c7771acf258c6ea1: Status 404 returned error can't find the container with id 789d035a9f79f391f48d9eb34015820b8e6c062bb4386625c7771acf258c6ea1 Oct 02 07:01:52 crc kubenswrapper[4786]: I1002 07:01:52.158959 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f57411f4-01bc-4ecc-a240-9101e861f97d","Type":"ContainerStarted","Data":"789d035a9f79f391f48d9eb34015820b8e6c062bb4386625c7771acf258c6ea1"} Oct 02 07:01:53 crc kubenswrapper[4786]: I1002 07:01:53.791463 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 07:01:53 crc kubenswrapper[4786]: I1002 07:01:53.840923 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" Oct 02 07:01:53 crc kubenswrapper[4786]: I1002 07:01:53.842724 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:53 crc kubenswrapper[4786]: I1002 07:01:53.902221 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5f8875c-7pgzk"] Oct 02 07:01:53 crc kubenswrapper[4786]: I1002 07:01:53.902469 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" podUID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" containerName="dnsmasq-dns" 
containerID="cri-o://4a3d0243946f1570af7ae9bcc8bee9f2a50855ed511e60eccca286ce8b6668e1" gracePeriod=10 Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.156942 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c598894db-vwfzw" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:53324->10.217.0.153:9311: read: connection reset by peer" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.156950 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c598894db-vwfzw" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.153:9311/healthcheck\": read tcp 10.217.0.2:53336->10.217.0.153:9311: read: connection reset by peer" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.187838 4786 generic.go:334] "Generic (PLEG): container finished" podID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" containerID="4a3d0243946f1570af7ae9bcc8bee9f2a50855ed511e60eccca286ce8b6668e1" exitCode=0 Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.188068 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="df8b9476-3526-44a6-8a19-486e005a851d" containerName="cinder-scheduler" containerID="cri-o://c9a877d72ec64f9d9967f81dadd9a543de1950dd8ef8a9463fdfc4cb8a7a5d7e" gracePeriod=30 Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.188201 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="df8b9476-3526-44a6-8a19-486e005a851d" containerName="probe" containerID="cri-o://c9f6d525107dd27850c86f77865cc9ba693077b3d6cf448bc5da12e619ee0693" gracePeriod=30 Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.195376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" event={"ID":"861e1d0e-bfec-4adf-8712-5ff000e0cf87","Type":"ContainerDied","Data":"4a3d0243946f1570af7ae9bcc8bee9f2a50855ed511e60eccca286ce8b6668e1"} Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.363891 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.497458 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7dd886698c-blkwg"] Oct 02 07:01:54 crc kubenswrapper[4786]: E1002 07:01:54.497887 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" containerName="dnsmasq-dns" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.497900 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" containerName="dnsmasq-dns" Oct 02 07:01:54 crc kubenswrapper[4786]: E1002 07:01:54.497928 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" containerName="init" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.497934 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" containerName="init" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.498103 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" containerName="dnsmasq-dns" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.498982 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.503927 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.504149 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.511804 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7dd886698c-blkwg"] Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.513779 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.532565 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-config\") pod \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.532621 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-sb\") pod \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.532667 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-svc\") pod \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.532748 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-nb\") pod \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.532891 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqwx7\" (UniqueName: \"kubernetes.io/projected/861e1d0e-bfec-4adf-8712-5ff000e0cf87-kube-api-access-hqwx7\") pod \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.532988 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-swift-storage-0\") pod \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\" (UID: \"861e1d0e-bfec-4adf-8712-5ff000e0cf87\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.546923 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861e1d0e-bfec-4adf-8712-5ff000e0cf87-kube-api-access-hqwx7" (OuterVolumeSpecName: "kube-api-access-hqwx7") pod "861e1d0e-bfec-4adf-8712-5ff000e0cf87" (UID: "861e1d0e-bfec-4adf-8712-5ff000e0cf87"). InnerVolumeSpecName "kube-api-access-hqwx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.580899 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "861e1d0e-bfec-4adf-8712-5ff000e0cf87" (UID: "861e1d0e-bfec-4adf-8712-5ff000e0cf87"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.589381 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-config" (OuterVolumeSpecName: "config") pod "861e1d0e-bfec-4adf-8712-5ff000e0cf87" (UID: "861e1d0e-bfec-4adf-8712-5ff000e0cf87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.600203 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "861e1d0e-bfec-4adf-8712-5ff000e0cf87" (UID: "861e1d0e-bfec-4adf-8712-5ff000e0cf87"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.611923 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "861e1d0e-bfec-4adf-8712-5ff000e0cf87" (UID: "861e1d0e-bfec-4adf-8712-5ff000e0cf87"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.612736 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "861e1d0e-bfec-4adf-8712-5ff000e0cf87" (UID: "861e1d0e-bfec-4adf-8712-5ff000e0cf87"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.625821 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.639057 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jpkh\" (UniqueName: \"kubernetes.io/projected/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-kube-api-access-2jpkh\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.639117 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-log-httpd\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.639151 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-public-tls-certs\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.639223 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-etc-swift\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.639477 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-config-data\") pod 
\"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.639731 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-combined-ca-bundle\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.639827 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-internal-tls-certs\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.639848 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-run-httpd\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.640022 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.640041 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.640054 4786 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-hqwx7\" (UniqueName: \"kubernetes.io/projected/861e1d0e-bfec-4adf-8712-5ff000e0cf87-kube-api-access-hqwx7\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.640063 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.640071 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.640079 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/861e1d0e-bfec-4adf-8712-5ff000e0cf87-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741350 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-combined-ca-bundle\") pod \"36d65eb2-c554-4ca7-a21a-16375fcbd118\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741454 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data-custom\") pod \"36d65eb2-c554-4ca7-a21a-16375fcbd118\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741529 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99q6g\" (UniqueName: \"kubernetes.io/projected/36d65eb2-c554-4ca7-a21a-16375fcbd118-kube-api-access-99q6g\") pod \"36d65eb2-c554-4ca7-a21a-16375fcbd118\" (UID: 
\"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741570 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d65eb2-c554-4ca7-a21a-16375fcbd118-logs\") pod \"36d65eb2-c554-4ca7-a21a-16375fcbd118\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741598 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data\") pod \"36d65eb2-c554-4ca7-a21a-16375fcbd118\" (UID: \"36d65eb2-c554-4ca7-a21a-16375fcbd118\") " Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741847 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jpkh\" (UniqueName: \"kubernetes.io/projected/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-kube-api-access-2jpkh\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741878 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-log-httpd\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741901 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-public-tls-certs\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.741949 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-etc-swift\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.742010 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-config-data\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.742104 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-combined-ca-bundle\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.742153 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-internal-tls-certs\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.742175 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-run-httpd\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.742567 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-run-httpd\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.742862 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d65eb2-c554-4ca7-a21a-16375fcbd118-logs" (OuterVolumeSpecName: "logs") pod "36d65eb2-c554-4ca7-a21a-16375fcbd118" (UID: "36d65eb2-c554-4ca7-a21a-16375fcbd118"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.744885 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-log-httpd\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.746647 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d65eb2-c554-4ca7-a21a-16375fcbd118-kube-api-access-99q6g" (OuterVolumeSpecName: "kube-api-access-99q6g") pod "36d65eb2-c554-4ca7-a21a-16375fcbd118" (UID: "36d65eb2-c554-4ca7-a21a-16375fcbd118"). InnerVolumeSpecName "kube-api-access-99q6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.746845 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "36d65eb2-c554-4ca7-a21a-16375fcbd118" (UID: "36d65eb2-c554-4ca7-a21a-16375fcbd118"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.749172 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-public-tls-certs\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.749406 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-internal-tls-certs\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.753312 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-config-data\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.753548 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-etc-swift\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.753930 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-combined-ca-bundle\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 
07:01:54.759922 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jpkh\" (UniqueName: \"kubernetes.io/projected/45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a-kube-api-access-2jpkh\") pod \"swift-proxy-7dd886698c-blkwg\" (UID: \"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a\") " pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.764231 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36d65eb2-c554-4ca7-a21a-16375fcbd118" (UID: "36d65eb2-c554-4ca7-a21a-16375fcbd118"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.787217 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data" (OuterVolumeSpecName: "config-data") pod "36d65eb2-c554-4ca7-a21a-16375fcbd118" (UID: "36d65eb2-c554-4ca7-a21a-16375fcbd118"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.846650 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.846727 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99q6g\" (UniqueName: \"kubernetes.io/projected/36d65eb2-c554-4ca7-a21a-16375fcbd118-kube-api-access-99q6g\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.846741 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36d65eb2-c554-4ca7-a21a-16375fcbd118-logs\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.846750 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.846759 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36d65eb2-c554-4ca7-a21a-16375fcbd118-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:54 crc kubenswrapper[4786]: I1002 07:01:54.853067 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.200870 4786 generic.go:334] "Generic (PLEG): container finished" podID="df8b9476-3526-44a6-8a19-486e005a851d" containerID="c9f6d525107dd27850c86f77865cc9ba693077b3d6cf448bc5da12e619ee0693" exitCode=0 Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.201246 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df8b9476-3526-44a6-8a19-486e005a851d","Type":"ContainerDied","Data":"c9f6d525107dd27850c86f77865cc9ba693077b3d6cf448bc5da12e619ee0693"} Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.212223 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-77z8r"] Oct 02 07:01:55 crc kubenswrapper[4786]: E1002 07:01:55.212512 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.212529 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api" Oct 02 07:01:55 crc kubenswrapper[4786]: E1002 07:01:55.212570 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api-log" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.212577 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api-log" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.212742 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api-log" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.212763 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerName="barbican-api" Oct 02 07:01:55 crc 
kubenswrapper[4786]: I1002 07:01:55.213222 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-77z8r" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.219559 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" event={"ID":"861e1d0e-bfec-4adf-8712-5ff000e0cf87","Type":"ContainerDied","Data":"46804c0869411976fb54fa0a3de4024c50ddd542aaf0258ff66f4bc6463a9b6d"} Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.219595 4786 scope.go:117] "RemoveContainer" containerID="4a3d0243946f1570af7ae9bcc8bee9f2a50855ed511e60eccca286ce8b6668e1" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.219713 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5f8875c-7pgzk" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.239490 4786 generic.go:334] "Generic (PLEG): container finished" podID="36d65eb2-c554-4ca7-a21a-16375fcbd118" containerID="1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515" exitCode=0 Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.239783 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c598894db-vwfzw" event={"ID":"36d65eb2-c554-4ca7-a21a-16375fcbd118","Type":"ContainerDied","Data":"1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515"} Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.239819 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c598894db-vwfzw" event={"ID":"36d65eb2-c554-4ca7-a21a-16375fcbd118","Type":"ContainerDied","Data":"0ffb17f61d81dec1353b34bb85f414135293be0aee3f94dbb70f249e6e45c068"} Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.240137 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c598894db-vwfzw" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.248666 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-77z8r"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.273824 4786 scope.go:117] "RemoveContainer" containerID="c74f50a4a3e6f120af84bf422ef80846867a5a412b781f302fa1692f0b9f2d38" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.293192 4786 scope.go:117] "RemoveContainer" containerID="1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.315667 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5f8875c-7pgzk"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.330906 4786 scope.go:117] "RemoveContainer" containerID="ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.336548 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5f8875c-7pgzk"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.360140 4786 scope.go:117] "RemoveContainer" containerID="1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.361458 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6tnt\" (UniqueName: \"kubernetes.io/projected/cd800405-6ab0-42b0-b1c3-b9667275a541-kube-api-access-t6tnt\") pod \"nova-api-db-create-77z8r\" (UID: \"cd800405-6ab0-42b0-b1c3-b9667275a541\") " pod="openstack/nova-api-db-create-77z8r" Oct 02 07:01:55 crc kubenswrapper[4786]: E1002 07:01:55.361581 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515\": container with ID starting with 
1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515 not found: ID does not exist" containerID="1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.361608 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515"} err="failed to get container status \"1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515\": rpc error: code = NotFound desc = could not find container \"1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515\": container with ID starting with 1820a9e6d371f32d52d64ca297b3853183aa70835c90ee39ebb6325567c98515 not found: ID does not exist" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.361629 4786 scope.go:117] "RemoveContainer" containerID="ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269" Oct 02 07:01:55 crc kubenswrapper[4786]: E1002 07:01:55.361903 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269\": container with ID starting with ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269 not found: ID does not exist" containerID="ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.361922 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269"} err="failed to get container status \"ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269\": rpc error: code = NotFound desc = could not find container \"ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269\": container with ID starting with ab32ae0c1759456161e1b96fd1154c1f08e8404006af167e359ed41c34533269 not found: ID does not 
exist" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.365159 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hx78d"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.366667 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hx78d" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.372625 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hx78d"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.379683 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c598894db-vwfzw"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.386124 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c598894db-vwfzw"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.391884 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7dd886698c-blkwg"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.425296 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wssnr"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.426441 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wssnr" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.432958 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wssnr"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.464404 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6tnt\" (UniqueName: \"kubernetes.io/projected/cd800405-6ab0-42b0-b1c3-b9667275a541-kube-api-access-t6tnt\") pod \"nova-api-db-create-77z8r\" (UID: \"cd800405-6ab0-42b0-b1c3-b9667275a541\") " pod="openstack/nova-api-db-create-77z8r" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.464453 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7mf\" (UniqueName: \"kubernetes.io/projected/bcad0a6b-e52d-463e-a961-0f081f4c0993-kube-api-access-4g7mf\") pod \"nova-cell0-db-create-hx78d\" (UID: \"bcad0a6b-e52d-463e-a961-0f081f4c0993\") " pod="openstack/nova-cell0-db-create-hx78d" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.479001 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6tnt\" (UniqueName: \"kubernetes.io/projected/cd800405-6ab0-42b0-b1c3-b9667275a541-kube-api-access-t6tnt\") pod \"nova-api-db-create-77z8r\" (UID: \"cd800405-6ab0-42b0-b1c3-b9667275a541\") " pod="openstack/nova-api-db-create-77z8r" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.522025 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.546138 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-77z8r" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.566398 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7mf\" (UniqueName: \"kubernetes.io/projected/bcad0a6b-e52d-463e-a961-0f081f4c0993-kube-api-access-4g7mf\") pod \"nova-cell0-db-create-hx78d\" (UID: \"bcad0a6b-e52d-463e-a961-0f081f4c0993\") " pod="openstack/nova-cell0-db-create-hx78d" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.566448 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql54v\" (UniqueName: \"kubernetes.io/projected/aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae-kube-api-access-ql54v\") pod \"nova-cell1-db-create-wssnr\" (UID: \"aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae\") " pod="openstack/nova-cell1-db-create-wssnr" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.583440 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7mf\" (UniqueName: \"kubernetes.io/projected/bcad0a6b-e52d-463e-a961-0f081f4c0993-kube-api-access-4g7mf\") pod \"nova-cell0-db-create-hx78d\" (UID: \"bcad0a6b-e52d-463e-a961-0f081f4c0993\") " pod="openstack/nova-cell0-db-create-hx78d" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.669095 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql54v\" (UniqueName: \"kubernetes.io/projected/aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae-kube-api-access-ql54v\") pod \"nova-cell1-db-create-wssnr\" (UID: \"aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae\") " pod="openstack/nova-cell1-db-create-wssnr" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.684885 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hx78d" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.686618 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql54v\" (UniqueName: \"kubernetes.io/projected/aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae-kube-api-access-ql54v\") pod \"nova-cell1-db-create-wssnr\" (UID: \"aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae\") " pod="openstack/nova-cell1-db-create-wssnr" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.797328 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.807061 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wssnr" Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.809776 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.809981 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="ceilometer-central-agent" containerID="cri-o://ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e" gracePeriod=30 Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.810078 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="ceilometer-notification-agent" containerID="cri-o://c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d" gracePeriod=30 Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.810096 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="proxy-httpd" containerID="cri-o://2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022" 
gracePeriod=30 Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.810077 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="sg-core" containerID="cri-o://3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180" gracePeriod=30 Oct 02 07:01:55 crc kubenswrapper[4786]: I1002 07:01:55.974346 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-77z8r"] Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.143181 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wssnr"] Oct 02 07:01:56 crc kubenswrapper[4786]: W1002 07:01:56.200599 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa2dceb4_1028_4ca1_972d_2f0b13bfc5ae.slice/crio-37b0e396cfd2eb2d43dd865e30b86d81f744fbd8d0713d2741c4224d3c484ddb WatchSource:0}: Error finding container 37b0e396cfd2eb2d43dd865e30b86d81f744fbd8d0713d2741c4224d3c484ddb: Status 404 returned error can't find the container with id 37b0e396cfd2eb2d43dd865e30b86d81f744fbd8d0713d2741c4224d3c484ddb Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.203684 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d65eb2-c554-4ca7-a21a-16375fcbd118" path="/var/lib/kubelet/pods/36d65eb2-c554-4ca7-a21a-16375fcbd118/volumes" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.204362 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861e1d0e-bfec-4adf-8712-5ff000e0cf87" path="/var/lib/kubelet/pods/861e1d0e-bfec-4adf-8712-5ff000e0cf87/volumes" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.204885 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hx78d"] Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.256648 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-proxy-7dd886698c-blkwg" event={"ID":"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a","Type":"ContainerStarted","Data":"e9cf27d95f0be81ab28eb2750bd5ef6e9d4f44d48daac5b4cfcff1bcef9dd4d8"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.256700 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dd886698c-blkwg" event={"ID":"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a","Type":"ContainerStarted","Data":"66553f9cf50dc3d584b30dd68fb16a897d8c502d9d2a0ea84907e87b899518ef"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.256714 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dd886698c-blkwg" event={"ID":"45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a","Type":"ContainerStarted","Data":"132ecea4f09cd34291102ec84aa9f02baa5b1d2cddd0340713d7f08a61ea258b"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.257485 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.257611 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7dd886698c-blkwg" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.270070 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-77z8r" event={"ID":"cd800405-6ab0-42b0-b1c3-b9667275a541","Type":"ContainerStarted","Data":"dfb562197a3d7dd15e83746d785a1295c829abf201fa4e2041d75bdcda91e622"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.275039 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7dd886698c-blkwg" podStartSLOduration=2.27502469 podStartE2EDuration="2.27502469s" podCreationTimestamp="2025-10-02 07:01:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:01:56.272683367 +0000 UTC m=+926.393866508" 
watchObservedRunningTime="2025-10-02 07:01:56.27502469 +0000 UTC m=+926.396207822" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.279755 4786 generic.go:334] "Generic (PLEG): container finished" podID="88561e6d-11c9-481c-b10c-db39b7291394" containerID="2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022" exitCode=0 Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.279783 4786 generic.go:334] "Generic (PLEG): container finished" podID="88561e6d-11c9-481c-b10c-db39b7291394" containerID="3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180" exitCode=2 Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.279792 4786 generic.go:334] "Generic (PLEG): container finished" podID="88561e6d-11c9-481c-b10c-db39b7291394" containerID="ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e" exitCode=0 Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.279827 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerDied","Data":"2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.279847 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerDied","Data":"3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.279857 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerDied","Data":"ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.281580 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wssnr" 
event={"ID":"aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae","Type":"ContainerStarted","Data":"37b0e396cfd2eb2d43dd865e30b86d81f744fbd8d0713d2741c4224d3c484ddb"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.285819 4786 generic.go:334] "Generic (PLEG): container finished" podID="df8b9476-3526-44a6-8a19-486e005a851d" containerID="c9a877d72ec64f9d9967f81dadd9a543de1950dd8ef8a9463fdfc4cb8a7a5d7e" exitCode=0 Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.285880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df8b9476-3526-44a6-8a19-486e005a851d","Type":"ContainerDied","Data":"c9a877d72ec64f9d9967f81dadd9a543de1950dd8ef8a9463fdfc4cb8a7a5d7e"} Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.532593 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.605851 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-scripts\") pod \"df8b9476-3526-44a6-8a19-486e005a851d\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.605930 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhg4n\" (UniqueName: \"kubernetes.io/projected/df8b9476-3526-44a6-8a19-486e005a851d-kube-api-access-zhg4n\") pod \"df8b9476-3526-44a6-8a19-486e005a851d\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.605999 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8b9476-3526-44a6-8a19-486e005a851d-etc-machine-id\") pod \"df8b9476-3526-44a6-8a19-486e005a851d\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 
07:01:56.606040 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data\") pod \"df8b9476-3526-44a6-8a19-486e005a851d\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.606092 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-combined-ca-bundle\") pod \"df8b9476-3526-44a6-8a19-486e005a851d\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.606109 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data-custom\") pod \"df8b9476-3526-44a6-8a19-486e005a851d\" (UID: \"df8b9476-3526-44a6-8a19-486e005a851d\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.606239 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df8b9476-3526-44a6-8a19-486e005a851d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "df8b9476-3526-44a6-8a19-486e005a851d" (UID: "df8b9476-3526-44a6-8a19-486e005a851d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.607967 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df8b9476-3526-44a6-8a19-486e005a851d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.612680 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-scripts" (OuterVolumeSpecName: "scripts") pod "df8b9476-3526-44a6-8a19-486e005a851d" (UID: "df8b9476-3526-44a6-8a19-486e005a851d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.612769 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df8b9476-3526-44a6-8a19-486e005a851d" (UID: "df8b9476-3526-44a6-8a19-486e005a851d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.612957 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df8b9476-3526-44a6-8a19-486e005a851d-kube-api-access-zhg4n" (OuterVolumeSpecName: "kube-api-access-zhg4n") pod "df8b9476-3526-44a6-8a19-486e005a851d" (UID: "df8b9476-3526-44a6-8a19-486e005a851d"). InnerVolumeSpecName "kube-api-access-zhg4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.652220 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df8b9476-3526-44a6-8a19-486e005a851d" (UID: "df8b9476-3526-44a6-8a19-486e005a851d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.686274 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data" (OuterVolumeSpecName: "config-data") pod "df8b9476-3526-44a6-8a19-486e005a851d" (UID: "df8b9476-3526-44a6-8a19-486e005a851d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.709761 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.709789 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhg4n\" (UniqueName: \"kubernetes.io/projected/df8b9476-3526-44a6-8a19-486e005a851d-kube-api-access-zhg4n\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.709800 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.709809 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 
02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.709818 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df8b9476-3526-44a6-8a19-486e005a851d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.906245 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.914365 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-sg-core-conf-yaml\") pod \"88561e6d-11c9-481c-b10c-db39b7291394\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.914443 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-log-httpd\") pod \"88561e6d-11c9-481c-b10c-db39b7291394\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.914475 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-config-data\") pod \"88561e6d-11c9-481c-b10c-db39b7291394\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.914581 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-run-httpd\") pod \"88561e6d-11c9-481c-b10c-db39b7291394\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.914752 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-shpms\" (UniqueName: \"kubernetes.io/projected/88561e6d-11c9-481c-b10c-db39b7291394-kube-api-access-shpms\") pod \"88561e6d-11c9-481c-b10c-db39b7291394\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.914852 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-scripts\") pod \"88561e6d-11c9-481c-b10c-db39b7291394\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.914930 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-combined-ca-bundle\") pod \"88561e6d-11c9-481c-b10c-db39b7291394\" (UID: \"88561e6d-11c9-481c-b10c-db39b7291394\") " Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.915071 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "88561e6d-11c9-481c-b10c-db39b7291394" (UID: "88561e6d-11c9-481c-b10c-db39b7291394"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.915635 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.916524 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "88561e6d-11c9-481c-b10c-db39b7291394" (UID: "88561e6d-11c9-481c-b10c-db39b7291394"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.918299 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88561e6d-11c9-481c-b10c-db39b7291394-kube-api-access-shpms" (OuterVolumeSpecName: "kube-api-access-shpms") pod "88561e6d-11c9-481c-b10c-db39b7291394" (UID: "88561e6d-11c9-481c-b10c-db39b7291394"). InnerVolumeSpecName "kube-api-access-shpms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.928418 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-scripts" (OuterVolumeSpecName: "scripts") pod "88561e6d-11c9-481c-b10c-db39b7291394" (UID: "88561e6d-11c9-481c-b10c-db39b7291394"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.947016 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "88561e6d-11c9-481c-b10c-db39b7291394" (UID: "88561e6d-11c9-481c-b10c-db39b7291394"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:56 crc kubenswrapper[4786]: I1002 07:01:56.997106 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88561e6d-11c9-481c-b10c-db39b7291394" (UID: "88561e6d-11c9-481c-b10c-db39b7291394"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.012476 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-config-data" (OuterVolumeSpecName: "config-data") pod "88561e6d-11c9-481c-b10c-db39b7291394" (UID: "88561e6d-11c9-481c-b10c-db39b7291394"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.022897 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.022924 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/88561e6d-11c9-481c-b10c-db39b7291394-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.022934 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shpms\" (UniqueName: \"kubernetes.io/projected/88561e6d-11c9-481c-b10c-db39b7291394-kube-api-access-shpms\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.022945 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.022954 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.022962 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/88561e6d-11c9-481c-b10c-db39b7291394-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.322064 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df8b9476-3526-44a6-8a19-486e005a851d","Type":"ContainerDied","Data":"73b5ed2d15918c88304a1976c5659070ea84a8dea1ccd56c7ceacfce4232bcd3"} Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.322120 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.322141 4786 scope.go:117] "RemoveContainer" containerID="c9f6d525107dd27850c86f77865cc9ba693077b3d6cf448bc5da12e619ee0693" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.330827 4786 generic.go:334] "Generic (PLEG): container finished" podID="bcad0a6b-e52d-463e-a961-0f081f4c0993" containerID="54323187d828070b7c871ed68d3841c79e06f6a95f1610a2663bb04b632e232e" exitCode=0 Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.330877 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hx78d" event={"ID":"bcad0a6b-e52d-463e-a961-0f081f4c0993","Type":"ContainerDied","Data":"54323187d828070b7c871ed68d3841c79e06f6a95f1610a2663bb04b632e232e"} Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.330900 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hx78d" event={"ID":"bcad0a6b-e52d-463e-a961-0f081f4c0993","Type":"ContainerStarted","Data":"3b30402d74328982c71cd81c6a81198cdfd650293bcc1c7a9d77526acd261cad"} Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.333473 4786 generic.go:334] "Generic (PLEG): container finished" podID="cd800405-6ab0-42b0-b1c3-b9667275a541" containerID="adcbf979f8f44f3e4eac40115e23bd364c08ee3e40122d1cea06ca203da73a9e" exitCode=0 Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.333527 4786 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-77z8r" event={"ID":"cd800405-6ab0-42b0-b1c3-b9667275a541","Type":"ContainerDied","Data":"adcbf979f8f44f3e4eac40115e23bd364c08ee3e40122d1cea06ca203da73a9e"} Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.335930 4786 generic.go:334] "Generic (PLEG): container finished" podID="88561e6d-11c9-481c-b10c-db39b7291394" containerID="c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d" exitCode=0 Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.335968 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerDied","Data":"c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d"} Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.335987 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"88561e6d-11c9-481c-b10c-db39b7291394","Type":"ContainerDied","Data":"f298318b5f484db0b50fa93831dce2155e499ab60275d11e6e8873ee1945e88c"} Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.336049 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.347905 4786 generic.go:334] "Generic (PLEG): container finished" podID="aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae" containerID="27fcc957455b933c54be6675e68f272b76e9e8f157ef3e564ef0a63d116d9a74" exitCode=0 Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.347951 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wssnr" event={"ID":"aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae","Type":"ContainerDied","Data":"27fcc957455b933c54be6675e68f272b76e9e8f157ef3e564ef0a63d116d9a74"} Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.358282 4786 scope.go:117] "RemoveContainer" containerID="c9a877d72ec64f9d9967f81dadd9a543de1950dd8ef8a9463fdfc4cb8a7a5d7e" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.394580 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.401895 4786 scope.go:117] "RemoveContainer" containerID="2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.420962 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.437299 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.455813 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="sg-core" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.455863 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="sg-core" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.455900 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88561e6d-11c9-481c-b10c-db39b7291394" 
containerName="ceilometer-central-agent" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.455925 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="ceilometer-central-agent" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.455935 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8b9476-3526-44a6-8a19-486e005a851d" containerName="cinder-scheduler" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.455941 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8b9476-3526-44a6-8a19-486e005a851d" containerName="cinder-scheduler" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.455953 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="proxy-httpd" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.455958 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="proxy-httpd" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.455967 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="ceilometer-notification-agent" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.455973 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="ceilometer-notification-agent" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.456004 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8b9476-3526-44a6-8a19-486e005a851d" containerName="probe" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.456011 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8b9476-3526-44a6-8a19-486e005a851d" containerName="probe" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.456235 4786 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="df8b9476-3526-44a6-8a19-486e005a851d" containerName="probe" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.456250 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="proxy-httpd" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.456259 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="sg-core" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.456273 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="ceilometer-notification-agent" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.456282 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="88561e6d-11c9-481c-b10c-db39b7291394" containerName="ceilometer-central-agent" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.456328 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="df8b9476-3526-44a6-8a19-486e005a851d" containerName="cinder-scheduler" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.457211 4786 scope.go:117] "RemoveContainer" containerID="3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.457749 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.457776 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.457854 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.459484 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.460661 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.461347 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.468584 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.469982 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.471491 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.473272 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.489146 4786 scope.go:117] "RemoveContainer" containerID="c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.497131 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.497178 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.514633 4786 scope.go:117] "RemoveContainer" containerID="ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531216 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531252 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531274 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531306 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531321 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-config-data\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531341 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-scripts\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531362 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f2dr\" (UniqueName: \"kubernetes.io/projected/a85dbfd2-4724-469f-937c-caf0eaf135d8-kube-api-access-9f2dr\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531376 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qrvq\" (UniqueName: \"kubernetes.io/projected/fda2a685-4713-46a8-9630-6d9d70f80bf7-kube-api-access-5qrvq\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531389 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-run-httpd\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531406 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fda2a685-4713-46a8-9630-6d9d70f80bf7-etc-machine-id\") 
pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531434 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-log-httpd\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531448 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.531480 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.533277 4786 scope.go:117] "RemoveContainer" containerID="2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.533710 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022\": container with ID starting with 2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022 not found: ID does not exist" containerID="2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.533737 4786 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022"} err="failed to get container status \"2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022\": rpc error: code = NotFound desc = could not find container \"2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022\": container with ID starting with 2911db3f9727bf76b333b196b5aaf4e2e6b2143417ac0faac92a24aa73f5c022 not found: ID does not exist" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.533755 4786 scope.go:117] "RemoveContainer" containerID="3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.534057 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180\": container with ID starting with 3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180 not found: ID does not exist" containerID="3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.534078 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180"} err="failed to get container status \"3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180\": rpc error: code = NotFound desc = could not find container \"3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180\": container with ID starting with 3584a8e1047614618d3dcf045aae727148d4fb2183bffd8846358b3674a6a180 not found: ID does not exist" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.534090 4786 scope.go:117] "RemoveContainer" containerID="c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.534370 4786 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d\": container with ID starting with c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d not found: ID does not exist" containerID="c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.534389 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d"} err="failed to get container status \"c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d\": rpc error: code = NotFound desc = could not find container \"c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d\": container with ID starting with c37256cf1676ad2dba59e68ebb7666ddcc54aa623546fb5e0c875521cb4ce14d not found: ID does not exist" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.534401 4786 scope.go:117] "RemoveContainer" containerID="ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e" Oct 02 07:01:57 crc kubenswrapper[4786]: E1002 07:01:57.534608 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e\": container with ID starting with ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e not found: ID does not exist" containerID="ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.534625 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e"} err="failed to get container status \"ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e\": rpc error: code = NotFound desc = could 
not find container \"ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e\": container with ID starting with ae9d1b6ebd0d0c544fe25da4b98dc1e992ee9b4a8b396564e96759437f102b3e not found: ID does not exist" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.614217 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5c985b58cc-mrm2m" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632381 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632414 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632459 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632475 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-config-data\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632513 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-scripts\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632543 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f2dr\" (UniqueName: \"kubernetes.io/projected/a85dbfd2-4724-469f-937c-caf0eaf135d8-kube-api-access-9f2dr\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632556 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qrvq\" (UniqueName: \"kubernetes.io/projected/fda2a685-4713-46a8-9630-6d9d70f80bf7-kube-api-access-5qrvq\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632570 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-run-httpd\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632603 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fda2a685-4713-46a8-9630-6d9d70f80bf7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632634 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-log-httpd\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 
07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632648 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632680 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.632761 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.640313 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.641145 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-run-httpd\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.641202 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/fda2a685-4713-46a8-9630-6d9d70f80bf7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.641368 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-log-httpd\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.641847 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.659965 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-config-data\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.662459 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-config-data\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.663793 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.663996 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qrvq\" (UniqueName: \"kubernetes.io/projected/fda2a685-4713-46a8-9630-6d9d70f80bf7-kube-api-access-5qrvq\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.664961 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-scripts\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.670455 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-scripts\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.673232 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-87d5477fb-vgjcn"] Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.673467 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-87d5477fb-vgjcn" podUID="f320f12a-3d58-4c69-8191-29399923abe2" containerName="neutron-api" containerID="cri-o://4e38365225cee24aff4e7e67e68d3ea58798295979f740a7bf6116aa24a91b66" gracePeriod=30 Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.673630 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-87d5477fb-vgjcn" podUID="f320f12a-3d58-4c69-8191-29399923abe2" containerName="neutron-httpd" containerID="cri-o://3d07d32eb08fa90802051e7d312bcebf5b44e7daf7802d97feb236249b2c67ef" gracePeriod=30 Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.674524 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda2a685-4713-46a8-9630-6d9d70f80bf7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fda2a685-4713-46a8-9630-6d9d70f80bf7\") " pod="openstack/cinder-scheduler-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.680840 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f2dr\" (UniqueName: \"kubernetes.io/projected/a85dbfd2-4724-469f-937c-caf0eaf135d8-kube-api-access-9f2dr\") pod \"ceilometer-0\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") " pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.788786 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:01:57 crc kubenswrapper[4786]: I1002 07:01:57.795167 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 07:01:58 crc kubenswrapper[4786]: I1002 07:01:58.189078 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88561e6d-11c9-481c-b10c-db39b7291394" path="/var/lib/kubelet/pods/88561e6d-11c9-481c-b10c-db39b7291394/volumes" Oct 02 07:01:58 crc kubenswrapper[4786]: I1002 07:01:58.190252 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df8b9476-3526-44a6-8a19-486e005a851d" path="/var/lib/kubelet/pods/df8b9476-3526-44a6-8a19-486e005a851d/volumes" Oct 02 07:01:58 crc kubenswrapper[4786]: I1002 07:01:58.248489 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 07:01:58 crc kubenswrapper[4786]: I1002 07:01:58.323039 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:01:58 crc kubenswrapper[4786]: W1002 07:01:58.334249 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda85dbfd2_4724_469f_937c_caf0eaf135d8.slice/crio-51f7f7c0b39c138dea3fc2d04aefe4cc8ccb878c7c789735ec43132d98e91c8f WatchSource:0}: Error finding container 51f7f7c0b39c138dea3fc2d04aefe4cc8ccb878c7c789735ec43132d98e91c8f: Status 404 returned error can't find the container with id 51f7f7c0b39c138dea3fc2d04aefe4cc8ccb878c7c789735ec43132d98e91c8f Oct 02 07:01:58 crc kubenswrapper[4786]: I1002 07:01:58.373056 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fda2a685-4713-46a8-9630-6d9d70f80bf7","Type":"ContainerStarted","Data":"170d5eae12e99b14abd079047c0fba53fae101cd7887c41c4b54e7aff750ce08"} Oct 02 07:01:58 crc kubenswrapper[4786]: I1002 07:01:58.375530 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerStarted","Data":"51f7f7c0b39c138dea3fc2d04aefe4cc8ccb878c7c789735ec43132d98e91c8f"} Oct 02 07:01:58 crc kubenswrapper[4786]: I1002 07:01:58.377754 4786 generic.go:334] "Generic (PLEG): container finished" podID="f320f12a-3d58-4c69-8191-29399923abe2" containerID="3d07d32eb08fa90802051e7d312bcebf5b44e7daf7802d97feb236249b2c67ef" exitCode=0 Oct 02 07:01:58 crc kubenswrapper[4786]: I1002 07:01:58.377801 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-87d5477fb-vgjcn" event={"ID":"f320f12a-3d58-4c69-8191-29399923abe2","Type":"ContainerDied","Data":"3d07d32eb08fa90802051e7d312bcebf5b44e7daf7802d97feb236249b2c67ef"} Oct 02 07:01:59 crc kubenswrapper[4786]: I1002 07:01:59.389448 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fda2a685-4713-46a8-9630-6d9d70f80bf7","Type":"ContainerStarted","Data":"3f068dfc211bb97f2934cdda21dd9be8a40a0865f5608176a693f0f6d0904b28"} Oct 02 07:02:01 crc kubenswrapper[4786]: I1002 07:02:01.406002 4786 generic.go:334] "Generic (PLEG): container 
finished" podID="f320f12a-3d58-4c69-8191-29399923abe2" containerID="4e38365225cee24aff4e7e67e68d3ea58798295979f740a7bf6116aa24a91b66" exitCode=0 Oct 02 07:02:01 crc kubenswrapper[4786]: I1002 07:02:01.406064 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-87d5477fb-vgjcn" event={"ID":"f320f12a-3d58-4c69-8191-29399923abe2","Type":"ContainerDied","Data":"4e38365225cee24aff4e7e67e68d3ea58798295979f740a7bf6116aa24a91b66"} Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.637232 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wssnr" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.679969 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hx78d" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.698102 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-77z8r" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.722349 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql54v\" (UniqueName: \"kubernetes.io/projected/aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae-kube-api-access-ql54v\") pod \"aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae\" (UID: \"aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae\") " Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.729894 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae-kube-api-access-ql54v" (OuterVolumeSpecName: "kube-api-access-ql54v") pod "aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae" (UID: "aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae"). InnerVolumeSpecName "kube-api-access-ql54v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.823661 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6tnt\" (UniqueName: \"kubernetes.io/projected/cd800405-6ab0-42b0-b1c3-b9667275a541-kube-api-access-t6tnt\") pod \"cd800405-6ab0-42b0-b1c3-b9667275a541\" (UID: \"cd800405-6ab0-42b0-b1c3-b9667275a541\") " Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.823950 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g7mf\" (UniqueName: \"kubernetes.io/projected/bcad0a6b-e52d-463e-a961-0f081f4c0993-kube-api-access-4g7mf\") pod \"bcad0a6b-e52d-463e-a961-0f081f4c0993\" (UID: \"bcad0a6b-e52d-463e-a961-0f081f4c0993\") " Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.824423 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql54v\" (UniqueName: \"kubernetes.io/projected/aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae-kube-api-access-ql54v\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.829769 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcad0a6b-e52d-463e-a961-0f081f4c0993-kube-api-access-4g7mf" (OuterVolumeSpecName: "kube-api-access-4g7mf") pod "bcad0a6b-e52d-463e-a961-0f081f4c0993" (UID: "bcad0a6b-e52d-463e-a961-0f081f4c0993"). InnerVolumeSpecName "kube-api-access-4g7mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.830120 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd800405-6ab0-42b0-b1c3-b9667275a541-kube-api-access-t6tnt" (OuterVolumeSpecName: "kube-api-access-t6tnt") pod "cd800405-6ab0-42b0-b1c3-b9667275a541" (UID: "cd800405-6ab0-42b0-b1c3-b9667275a541"). InnerVolumeSpecName "kube-api-access-t6tnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.850886 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-87d5477fb-vgjcn" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.924969 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-combined-ca-bundle\") pod \"f320f12a-3d58-4c69-8191-29399923abe2\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.925091 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgc5g\" (UniqueName: \"kubernetes.io/projected/f320f12a-3d58-4c69-8191-29399923abe2-kube-api-access-hgc5g\") pod \"f320f12a-3d58-4c69-8191-29399923abe2\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.925115 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-ovndb-tls-certs\") pod \"f320f12a-3d58-4c69-8191-29399923abe2\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.925144 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-config\") pod \"f320f12a-3d58-4c69-8191-29399923abe2\" (UID: \"f320f12a-3d58-4c69-8191-29399923abe2\") " Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.925190 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-httpd-config\") pod \"f320f12a-3d58-4c69-8191-29399923abe2\" (UID: 
\"f320f12a-3d58-4c69-8191-29399923abe2\") " Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.925548 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6tnt\" (UniqueName: \"kubernetes.io/projected/cd800405-6ab0-42b0-b1c3-b9667275a541-kube-api-access-t6tnt\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.925564 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g7mf\" (UniqueName: \"kubernetes.io/projected/bcad0a6b-e52d-463e-a961-0f081f4c0993-kube-api-access-4g7mf\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.928854 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f320f12a-3d58-4c69-8191-29399923abe2" (UID: "f320f12a-3d58-4c69-8191-29399923abe2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.932849 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f320f12a-3d58-4c69-8191-29399923abe2-kube-api-access-hgc5g" (OuterVolumeSpecName: "kube-api-access-hgc5g") pod "f320f12a-3d58-4c69-8191-29399923abe2" (UID: "f320f12a-3d58-4c69-8191-29399923abe2"). InnerVolumeSpecName "kube-api-access-hgc5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.963088 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f320f12a-3d58-4c69-8191-29399923abe2" (UID: "f320f12a-3d58-4c69-8191-29399923abe2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.966795 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-config" (OuterVolumeSpecName: "config") pod "f320f12a-3d58-4c69-8191-29399923abe2" (UID: "f320f12a-3d58-4c69-8191-29399923abe2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:02 crc kubenswrapper[4786]: I1002 07:02:02.983446 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f320f12a-3d58-4c69-8191-29399923abe2" (UID: "f320f12a-3d58-4c69-8191-29399923abe2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.026775 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgc5g\" (UniqueName: \"kubernetes.io/projected/f320f12a-3d58-4c69-8191-29399923abe2-kube-api-access-hgc5g\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.026809 4786 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.026819 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.026827 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:03 crc 
kubenswrapper[4786]: I1002 07:02:03.026837 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f320f12a-3d58-4c69-8191-29399923abe2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.421807 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f57411f4-01bc-4ecc-a240-9101e861f97d","Type":"ContainerStarted","Data":"34753bc1f504672649819e26ce1e633c6b08fc0559976296ccdf68482df4decb"} Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.424618 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hx78d" event={"ID":"bcad0a6b-e52d-463e-a961-0f081f4c0993","Type":"ContainerDied","Data":"3b30402d74328982c71cd81c6a81198cdfd650293bcc1c7a9d77526acd261cad"} Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.424644 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hx78d" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.424659 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b30402d74328982c71cd81c6a81198cdfd650293bcc1c7a9d77526acd261cad" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.425774 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-77z8r" event={"ID":"cd800405-6ab0-42b0-b1c3-b9667275a541","Type":"ContainerDied","Data":"dfb562197a3d7dd15e83746d785a1295c829abf201fa4e2041d75bdcda91e622"} Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.425787 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-77z8r" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.425800 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfb562197a3d7dd15e83746d785a1295c829abf201fa4e2041d75bdcda91e622" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.427043 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fda2a685-4713-46a8-9630-6d9d70f80bf7","Type":"ContainerStarted","Data":"25d1e192bb92ee30d348baa004b0f7beba1bc355495455ae57a1f4afaaf7182c"} Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.428177 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wssnr" event={"ID":"aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae","Type":"ContainerDied","Data":"37b0e396cfd2eb2d43dd865e30b86d81f744fbd8d0713d2741c4224d3c484ddb"} Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.428212 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b0e396cfd2eb2d43dd865e30b86d81f744fbd8d0713d2741c4224d3c484ddb" Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.428264 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wssnr"
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.439224 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.55655634 podStartE2EDuration="13.439212265s" podCreationTimestamp="2025-10-02 07:01:50 +0000 UTC" firstStartedPulling="2025-10-02 07:01:51.660564187 +0000 UTC m=+921.781747318" lastFinishedPulling="2025-10-02 07:02:02.543220111 +0000 UTC m=+932.664403243" observedRunningTime="2025-10-02 07:02:03.433831384 +0000 UTC m=+933.555014535" watchObservedRunningTime="2025-10-02 07:02:03.439212265 +0000 UTC m=+933.560395396"
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.439880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerStarted","Data":"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"}
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.444677 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-87d5477fb-vgjcn" event={"ID":"f320f12a-3d58-4c69-8191-29399923abe2","Type":"ContainerDied","Data":"f6e65a19cd209ffa3a65af8dd0d0a665eea90a6c407b8afb8bc15c99ebc1c585"}
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.444733 4786 scope.go:117] "RemoveContainer" containerID="3d07d32eb08fa90802051e7d312bcebf5b44e7daf7802d97feb236249b2c67ef"
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.444831 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-87d5477fb-vgjcn"
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.470803 4786 scope.go:117] "RemoveContainer" containerID="4e38365225cee24aff4e7e67e68d3ea58798295979f740a7bf6116aa24a91b66"
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.616271 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.616251478 podStartE2EDuration="6.616251478s" podCreationTimestamp="2025-10-02 07:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:03.449996422 +0000 UTC m=+933.571179564" watchObservedRunningTime="2025-10-02 07:02:03.616251478 +0000 UTC m=+933.737434610"
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.620759 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-87d5477fb-vgjcn"]
Oct 02 07:02:03 crc kubenswrapper[4786]: I1002 07:02:03.626562 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-87d5477fb-vgjcn"]
Oct 02 07:02:04 crc kubenswrapper[4786]: I1002 07:02:04.187367 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f320f12a-3d58-4c69-8191-29399923abe2" path="/var/lib/kubelet/pods/f320f12a-3d58-4c69-8191-29399923abe2/volumes"
Oct 02 07:02:04 crc kubenswrapper[4786]: I1002 07:02:04.452067 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerStarted","Data":"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"}
Oct 02 07:02:04 crc kubenswrapper[4786]: I1002 07:02:04.452113 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerStarted","Data":"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"}
Oct 02 07:02:04 crc kubenswrapper[4786]: I1002 07:02:04.612465 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 07:02:04 crc kubenswrapper[4786]: I1002 07:02:04.859124 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7dd886698c-blkwg"
Oct 02 07:02:04 crc kubenswrapper[4786]: I1002 07:02:04.861013 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7dd886698c-blkwg"
Oct 02 07:02:06 crc kubenswrapper[4786]: I1002 07:02:06.469602 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerStarted","Data":"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"}
Oct 02 07:02:06 crc kubenswrapper[4786]: I1002 07:02:06.469931 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 02 07:02:06 crc kubenswrapper[4786]: I1002 07:02:06.469833 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="proxy-httpd" containerID="cri-o://b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95" gracePeriod=30
Oct 02 07:02:06 crc kubenswrapper[4786]: I1002 07:02:06.469746 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="ceilometer-central-agent" containerID="cri-o://3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16" gracePeriod=30
Oct 02 07:02:06 crc kubenswrapper[4786]: I1002 07:02:06.469882 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="ceilometer-notification-agent" containerID="cri-o://79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054" gracePeriod=30
Oct 02 07:02:06 crc kubenswrapper[4786]: I1002 07:02:06.469896 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="sg-core" containerID="cri-o://e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f" gracePeriod=30
Oct 02 07:02:06 crc kubenswrapper[4786]: I1002 07:02:06.490656 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.349645151 podStartE2EDuration="9.490645101s" podCreationTimestamp="2025-10-02 07:01:57 +0000 UTC" firstStartedPulling="2025-10-02 07:01:58.33793801 +0000 UTC m=+928.459121142" lastFinishedPulling="2025-10-02 07:02:05.47893796 +0000 UTC m=+935.600121092" observedRunningTime="2025-10-02 07:02:06.484081599 +0000 UTC m=+936.605264740" watchObservedRunningTime="2025-10-02 07:02:06.490645101 +0000 UTC m=+936.611828231"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.043132 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.206239 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-run-httpd\") pod \"a85dbfd2-4724-469f-937c-caf0eaf135d8\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") "
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.206343 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-scripts\") pod \"a85dbfd2-4724-469f-937c-caf0eaf135d8\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") "
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.206607 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a85dbfd2-4724-469f-937c-caf0eaf135d8" (UID: "a85dbfd2-4724-469f-937c-caf0eaf135d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.207464 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-sg-core-conf-yaml\") pod \"a85dbfd2-4724-469f-937c-caf0eaf135d8\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") "
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.207544 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-combined-ca-bundle\") pod \"a85dbfd2-4724-469f-937c-caf0eaf135d8\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") "
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.207585 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-log-httpd\") pod \"a85dbfd2-4724-469f-937c-caf0eaf135d8\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") "
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.207605 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f2dr\" (UniqueName: \"kubernetes.io/projected/a85dbfd2-4724-469f-937c-caf0eaf135d8-kube-api-access-9f2dr\") pod \"a85dbfd2-4724-469f-937c-caf0eaf135d8\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") "
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.207730 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-config-data\") pod \"a85dbfd2-4724-469f-937c-caf0eaf135d8\" (UID: \"a85dbfd2-4724-469f-937c-caf0eaf135d8\") "
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.207906 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a85dbfd2-4724-469f-937c-caf0eaf135d8" (UID: "a85dbfd2-4724-469f-937c-caf0eaf135d8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.208195 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.208211 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a85dbfd2-4724-469f-937c-caf0eaf135d8-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.211327 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-scripts" (OuterVolumeSpecName: "scripts") pod "a85dbfd2-4724-469f-937c-caf0eaf135d8" (UID: "a85dbfd2-4724-469f-937c-caf0eaf135d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.211748 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a85dbfd2-4724-469f-937c-caf0eaf135d8-kube-api-access-9f2dr" (OuterVolumeSpecName: "kube-api-access-9f2dr") pod "a85dbfd2-4724-469f-937c-caf0eaf135d8" (UID: "a85dbfd2-4724-469f-937c-caf0eaf135d8"). InnerVolumeSpecName "kube-api-access-9f2dr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.226854 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a85dbfd2-4724-469f-937c-caf0eaf135d8" (UID: "a85dbfd2-4724-469f-937c-caf0eaf135d8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.255240 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a85dbfd2-4724-469f-937c-caf0eaf135d8" (UID: "a85dbfd2-4724-469f-937c-caf0eaf135d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.274567 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-config-data" (OuterVolumeSpecName: "config-data") pod "a85dbfd2-4724-469f-937c-caf0eaf135d8" (UID: "a85dbfd2-4724-469f-937c-caf0eaf135d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.309556 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.309580 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.309589 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.309600 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85dbfd2-4724-469f-937c-caf0eaf135d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.309608 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f2dr\" (UniqueName: \"kubernetes.io/projected/a85dbfd2-4724-469f-937c-caf0eaf135d8-kube-api-access-9f2dr\") on node \"crc\" DevicePath \"\""
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.478576 4786 generic.go:334] "Generic (PLEG): container finished" podID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerID="b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95" exitCode=0
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.478607 4786 generic.go:334] "Generic (PLEG): container finished" podID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerID="e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f" exitCode=2
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.478616 4786 generic.go:334] "Generic (PLEG): container finished" podID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerID="79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054" exitCode=0
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.478623 4786 generic.go:334] "Generic (PLEG): container finished" podID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerID="3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16" exitCode=0
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.478643 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.478643 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerDied","Data":"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"}
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.479201 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerDied","Data":"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"}
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.479214 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerDied","Data":"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"}
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.479223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerDied","Data":"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"}
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.479232 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a85dbfd2-4724-469f-937c-caf0eaf135d8","Type":"ContainerDied","Data":"51f7f7c0b39c138dea3fc2d04aefe4cc8ccb878c7c789735ec43132d98e91c8f"}
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.479247 4786 scope.go:117] "RemoveContainer" containerID="b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.501135 4786 scope.go:117] "RemoveContainer" containerID="e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.507359 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.513191 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.517553 4786 scope.go:117] "RemoveContainer" containerID="79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522481 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522820 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcad0a6b-e52d-463e-a961-0f081f4c0993" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522837 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcad0a6b-e52d-463e-a961-0f081f4c0993" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522855 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd800405-6ab0-42b0-b1c3-b9667275a541" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522862 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd800405-6ab0-42b0-b1c3-b9667275a541" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522870 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="sg-core"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522875 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="sg-core"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522890 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f320f12a-3d58-4c69-8191-29399923abe2" containerName="neutron-httpd"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522897 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f320f12a-3d58-4c69-8191-29399923abe2" containerName="neutron-httpd"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522919 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="ceilometer-central-agent"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522924 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="ceilometer-central-agent"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522934 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522940 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522948 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="ceilometer-notification-agent"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522953 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="ceilometer-notification-agent"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522962 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="proxy-httpd"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522967 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="proxy-httpd"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.522977 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f320f12a-3d58-4c69-8191-29399923abe2" containerName="neutron-api"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.522982 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f320f12a-3d58-4c69-8191-29399923abe2" containerName="neutron-api"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523147 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f320f12a-3d58-4c69-8191-29399923abe2" containerName="neutron-api"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523164 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="sg-core"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523174 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcad0a6b-e52d-463e-a961-0f081f4c0993" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523185 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="ceilometer-central-agent"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523192 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd800405-6ab0-42b0-b1c3-b9667275a541" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523200 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f320f12a-3d58-4c69-8191-29399923abe2" containerName="neutron-httpd"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523209 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="proxy-httpd"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523218 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" containerName="ceilometer-notification-agent"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.523224 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae" containerName="mariadb-database-create"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.526797 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.528789 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.528982 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.533043 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.534773 4786 scope.go:117] "RemoveContainer" containerID="3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.560712 4786 scope.go:117] "RemoveContainer" containerID="b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.561060 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": container with ID starting with b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95 not found: ID does not exist" containerID="b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.561091 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"} err="failed to get container status \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": rpc error: code = NotFound desc = could not find container \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": container with ID starting with b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.561111 4786 scope.go:117] "RemoveContainer" containerID="e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.561355 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": container with ID starting with e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f not found: ID does not exist" containerID="e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.561375 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"} err="failed to get container status \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": rpc error: code = NotFound desc = could not find container \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": container with ID starting with e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.561389 4786 scope.go:117] "RemoveContainer" containerID="79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.561800 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": container with ID starting with 79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054 not found: ID does not exist" containerID="79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.561828 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"} err="failed to get container status \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": rpc error: code = NotFound desc = could not find container \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": container with ID starting with 79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.561852 4786 scope.go:117] "RemoveContainer" containerID="3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"
Oct 02 07:02:07 crc kubenswrapper[4786]: E1002 07:02:07.562116 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": container with ID starting with 3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16 not found: ID does not exist" containerID="3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.562141 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"} err="failed to get container status \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": rpc error: code = NotFound desc = could not find container \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": container with ID starting with 3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.562154 4786 scope.go:117] "RemoveContainer" containerID="b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.562373 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"} err="failed to get container status \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": rpc error: code = NotFound desc = could not find container \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": container with ID starting with b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.562408 4786 scope.go:117] "RemoveContainer" containerID="e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.562710 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"} err="failed to get container status \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": rpc error: code = NotFound desc = could not find container \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": container with ID starting with e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.562731 4786 scope.go:117] "RemoveContainer" containerID="79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.562973 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"} err="failed to get container status \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": rpc error: code = NotFound desc = could not find container \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": container with ID starting with 79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.562996 4786 scope.go:117] "RemoveContainer" containerID="3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.563231 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"} err="failed to get container status \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": rpc error: code = NotFound desc = could not find container \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": container with ID starting with 3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.563264 4786 scope.go:117] "RemoveContainer" containerID="b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.563510 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"} err="failed to get container status \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": rpc error: code = NotFound desc = could not find container \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": container with ID starting with b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.563554 4786 scope.go:117] "RemoveContainer" containerID="e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.563805 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"} err="failed to get container status \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": rpc error: code = NotFound desc = could not find container \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": container with ID starting with e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.563826 4786 scope.go:117] "RemoveContainer" containerID="79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564000 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"} err="failed to get container status \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": rpc error: code = NotFound desc = could not find container \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": container with ID starting with 79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564021 4786 scope.go:117] "RemoveContainer" containerID="3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564271 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"} err="failed to get container status \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": rpc error: code = NotFound desc = could not find container \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": container with ID starting with 3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564288 4786 scope.go:117] "RemoveContainer" containerID="b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564493 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95"} err="failed to get container status \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": rpc error: code = NotFound desc = could not find container \"b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95\": container with ID starting with b8d0e0d8f8f63570096e4e2ed7febe69849b448df8819ba2066569b38db64f95 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564541 4786 scope.go:117] "RemoveContainer" containerID="e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564719 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f"} err="failed to get container status \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": rpc error: code = NotFound desc = could not find container \"e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f\": container with ID starting with e0fbf0ca4164047913c8837f833cca9447420ab795dc1c2cf07b95e9c796101f not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564739 4786 scope.go:117] "RemoveContainer" containerID="79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564926 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054"} err="failed to get container status \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": rpc error: code = NotFound desc = could not find container \"79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054\": container with ID starting with 79e29c5bc173ceacc7a42de014f21851d092493b90cf1cc8fc15737721356054 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.564941 4786 scope.go:117] "RemoveContainer" containerID="3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.565204 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16"} err="failed to get container status \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": rpc error: code = NotFound desc = could not find container \"3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16\": container with ID starting with 3c17e1b381f66238cdacd94cecd6d618f39fdc4dff0a2d1396c84edea6b7ec16 not found: ID does not exist"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.715002 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.715251 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-scripts\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.715394 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-log-httpd\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.715421 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjqff\" (UniqueName: \"kubernetes.io/projected/bef45940-0556-4ff7-b16c-bdab174e7f51-kube-api-access-qjqff\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.715480 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-run-httpd\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0"
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.715506 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-config-data\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") "
pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.715554 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.795672 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816496 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-log-httpd\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816543 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjqff\" (UniqueName: \"kubernetes.io/projected/bef45940-0556-4ff7-b16c-bdab174e7f51-kube-api-access-qjqff\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816586 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-run-httpd\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816604 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-config-data\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" 
Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816711 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816730 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-scripts\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816892 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-log-httpd\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.816984 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-run-httpd\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.819728 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.820171 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.820645 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-scripts\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.829036 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-config-data\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.830569 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjqff\" (UniqueName: \"kubernetes.io/projected/bef45940-0556-4ff7-b16c-bdab174e7f51-kube-api-access-qjqff\") pod \"ceilometer-0\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.843959 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:07 crc kubenswrapper[4786]: I1002 07:02:07.954031 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 07:02:08 crc kubenswrapper[4786]: I1002 07:02:08.187121 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a85dbfd2-4724-469f-937c-caf0eaf135d8" path="/var/lib/kubelet/pods/a85dbfd2-4724-469f-937c-caf0eaf135d8/volumes" Oct 02 07:02:08 crc kubenswrapper[4786]: I1002 07:02:08.258730 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:08 crc kubenswrapper[4786]: I1002 07:02:08.488021 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerStarted","Data":"badaff859dbabf262ab6c7ed28a1eb6c0c902e83d37bad543f518cf4af47ecbb"} Oct 02 07:02:09 crc kubenswrapper[4786]: I1002 07:02:09.494131 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerStarted","Data":"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455"} Oct 02 07:02:09 crc kubenswrapper[4786]: I1002 07:02:09.625877 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:10 crc kubenswrapper[4786]: I1002 07:02:10.501332 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerStarted","Data":"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b"} Oct 02 07:02:11 crc kubenswrapper[4786]: I1002 07:02:11.508965 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerStarted","Data":"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5"} Oct 02 07:02:12 crc 
kubenswrapper[4786]: I1002 07:02:12.517982 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerStarted","Data":"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4"} Oct 02 07:02:12 crc kubenswrapper[4786]: I1002 07:02:12.518129 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="ceilometer-central-agent" containerID="cri-o://f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455" gracePeriod=30 Oct 02 07:02:12 crc kubenswrapper[4786]: I1002 07:02:12.518265 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 07:02:12 crc kubenswrapper[4786]: I1002 07:02:12.518304 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="sg-core" containerID="cri-o://13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5" gracePeriod=30 Oct 02 07:02:12 crc kubenswrapper[4786]: I1002 07:02:12.518346 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="proxy-httpd" containerID="cri-o://25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4" gracePeriod=30 Oct 02 07:02:12 crc kubenswrapper[4786]: I1002 07:02:12.518364 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="ceilometer-notification-agent" containerID="cri-o://af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b" gracePeriod=30 Oct 02 07:02:12 crc kubenswrapper[4786]: I1002 07:02:12.537538 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.025697764 podStartE2EDuration="5.537528541s" podCreationTimestamp="2025-10-02 07:02:07 +0000 UTC" firstStartedPulling="2025-10-02 07:02:08.263868057 +0000 UTC m=+938.385051188" lastFinishedPulling="2025-10-02 07:02:11.775698834 +0000 UTC m=+941.896881965" observedRunningTime="2025-10-02 07:02:12.532389605 +0000 UTC m=+942.653572746" watchObservedRunningTime="2025-10-02 07:02:12.537528541 +0000 UTC m=+942.658711672" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.408651 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.499270 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-combined-ca-bundle\") pod \"bef45940-0556-4ff7-b16c-bdab174e7f51\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.499493 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-run-httpd\") pod \"bef45940-0556-4ff7-b16c-bdab174e7f51\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.499623 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-config-data\") pod \"bef45940-0556-4ff7-b16c-bdab174e7f51\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.499740 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-scripts\") pod \"bef45940-0556-4ff7-b16c-bdab174e7f51\" (UID: 
\"bef45940-0556-4ff7-b16c-bdab174e7f51\") " Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.499825 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-log-httpd\") pod \"bef45940-0556-4ff7-b16c-bdab174e7f51\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.499911 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-sg-core-conf-yaml\") pod \"bef45940-0556-4ff7-b16c-bdab174e7f51\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.499966 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bef45940-0556-4ff7-b16c-bdab174e7f51" (UID: "bef45940-0556-4ff7-b16c-bdab174e7f51"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.500048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjqff\" (UniqueName: \"kubernetes.io/projected/bef45940-0556-4ff7-b16c-bdab174e7f51-kube-api-access-qjqff\") pod \"bef45940-0556-4ff7-b16c-bdab174e7f51\" (UID: \"bef45940-0556-4ff7-b16c-bdab174e7f51\") " Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.500283 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bef45940-0556-4ff7-b16c-bdab174e7f51" (UID: "bef45940-0556-4ff7-b16c-bdab174e7f51"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.500663 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.500749 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bef45940-0556-4ff7-b16c-bdab174e7f51-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.503434 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef45940-0556-4ff7-b16c-bdab174e7f51-kube-api-access-qjqff" (OuterVolumeSpecName: "kube-api-access-qjqff") pod "bef45940-0556-4ff7-b16c-bdab174e7f51" (UID: "bef45940-0556-4ff7-b16c-bdab174e7f51"). InnerVolumeSpecName "kube-api-access-qjqff". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.515438 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-scripts" (OuterVolumeSpecName: "scripts") pod "bef45940-0556-4ff7-b16c-bdab174e7f51" (UID: "bef45940-0556-4ff7-b16c-bdab174e7f51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.518194 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bef45940-0556-4ff7-b16c-bdab174e7f51" (UID: "bef45940-0556-4ff7-b16c-bdab174e7f51"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535410 4786 generic.go:334] "Generic (PLEG): container finished" podID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerID="25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4" exitCode=0 Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535434 4786 generic.go:334] "Generic (PLEG): container finished" podID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerID="13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5" exitCode=2 Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535441 4786 generic.go:334] "Generic (PLEG): container finished" podID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerID="af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b" exitCode=0 Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535447 4786 generic.go:334] "Generic (PLEG): container finished" podID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerID="f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455" exitCode=0 Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535463 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerDied","Data":"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4"} Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535484 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerDied","Data":"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5"} Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535493 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerDied","Data":"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b"} Oct 02 07:02:13 crc 
kubenswrapper[4786]: I1002 07:02:13.535501 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerDied","Data":"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455"} Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535509 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bef45940-0556-4ff7-b16c-bdab174e7f51","Type":"ContainerDied","Data":"badaff859dbabf262ab6c7ed28a1eb6c0c902e83d37bad543f518cf4af47ecbb"} Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.535544 4786 scope.go:117] "RemoveContainer" containerID="25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.536265 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.550805 4786 scope.go:117] "RemoveContainer" containerID="13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.554049 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bef45940-0556-4ff7-b16c-bdab174e7f51" (UID: "bef45940-0556-4ff7-b16c-bdab174e7f51"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.564732 4786 scope.go:117] "RemoveContainer" containerID="af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.575008 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-config-data" (OuterVolumeSpecName: "config-data") pod "bef45940-0556-4ff7-b16c-bdab174e7f51" (UID: "bef45940-0556-4ff7-b16c-bdab174e7f51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.590157 4786 scope.go:117] "RemoveContainer" containerID="f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.602480 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.602503 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.602513 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.602533 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjqff\" (UniqueName: \"kubernetes.io/projected/bef45940-0556-4ff7-b16c-bdab174e7f51-kube-api-access-qjqff\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.602541 4786 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bef45940-0556-4ff7-b16c-bdab174e7f51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.605056 4786 scope.go:117] "RemoveContainer" containerID="25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4" Oct 02 07:02:13 crc kubenswrapper[4786]: E1002 07:02:13.605417 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": container with ID starting with 25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4 not found: ID does not exist" containerID="25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.605445 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4"} err="failed to get container status \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": rpc error: code = NotFound desc = could not find container \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": container with ID starting with 25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.605463 4786 scope.go:117] "RemoveContainer" containerID="13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5" Oct 02 07:02:13 crc kubenswrapper[4786]: E1002 07:02:13.605816 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": container with ID starting with 13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5 not found: ID does not exist" 
containerID="13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.605847 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5"} err="failed to get container status \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": rpc error: code = NotFound desc = could not find container \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": container with ID starting with 13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.605867 4786 scope.go:117] "RemoveContainer" containerID="af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b" Oct 02 07:02:13 crc kubenswrapper[4786]: E1002 07:02:13.606151 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": container with ID starting with af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b not found: ID does not exist" containerID="af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.606177 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b"} err="failed to get container status \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": rpc error: code = NotFound desc = could not find container \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": container with ID starting with af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.606193 4786 scope.go:117] 
"RemoveContainer" containerID="f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455" Oct 02 07:02:13 crc kubenswrapper[4786]: E1002 07:02:13.606455 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": container with ID starting with f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455 not found: ID does not exist" containerID="f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.606480 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455"} err="failed to get container status \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": rpc error: code = NotFound desc = could not find container \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": container with ID starting with f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.606494 4786 scope.go:117] "RemoveContainer" containerID="25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.606751 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4"} err="failed to get container status \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": rpc error: code = NotFound desc = could not find container \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": container with ID starting with 25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.606770 4786 
scope.go:117] "RemoveContainer" containerID="13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.606963 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5"} err="failed to get container status \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": rpc error: code = NotFound desc = could not find container \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": container with ID starting with 13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.606983 4786 scope.go:117] "RemoveContainer" containerID="af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.607164 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b"} err="failed to get container status \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": rpc error: code = NotFound desc = could not find container \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": container with ID starting with af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.607182 4786 scope.go:117] "RemoveContainer" containerID="f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.607350 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455"} err="failed to get container status \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": rpc 
error: code = NotFound desc = could not find container \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": container with ID starting with f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.607368 4786 scope.go:117] "RemoveContainer" containerID="25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.607536 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4"} err="failed to get container status \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": rpc error: code = NotFound desc = could not find container \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": container with ID starting with 25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.607553 4786 scope.go:117] "RemoveContainer" containerID="13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.607835 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5"} err="failed to get container status \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": rpc error: code = NotFound desc = could not find container \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": container with ID starting with 13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.607871 4786 scope.go:117] "RemoveContainer" containerID="af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b" Oct 02 07:02:13 crc 
kubenswrapper[4786]: I1002 07:02:13.608112 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b"} err="failed to get container status \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": rpc error: code = NotFound desc = could not find container \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": container with ID starting with af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.608129 4786 scope.go:117] "RemoveContainer" containerID="f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.608307 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455"} err="failed to get container status \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": rpc error: code = NotFound desc = could not find container \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": container with ID starting with f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.608324 4786 scope.go:117] "RemoveContainer" containerID="25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.608488 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4"} err="failed to get container status \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": rpc error: code = NotFound desc = could not find container \"25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4\": container 
with ID starting with 25c81c15b27af0fe65977ea38ae6f0b299198303769172b7fb2a20afafd593d4 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.608505 4786 scope.go:117] "RemoveContainer" containerID="13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.608741 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5"} err="failed to get container status \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": rpc error: code = NotFound desc = could not find container \"13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5\": container with ID starting with 13d9397396ca70e7767121e27b834e50559fd71de7d4daab25fd3e988863d6f5 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.608777 4786 scope.go:117] "RemoveContainer" containerID="af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.609015 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b"} err="failed to get container status \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": rpc error: code = NotFound desc = could not find container \"af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b\": container with ID starting with af0d2d7784a4616ccda13418720aaea6c911f3e88d290e1b820bbbc909c3119b not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.609031 4786 scope.go:117] "RemoveContainer" containerID="f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.609253 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455"} err="failed to get container status \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": rpc error: code = NotFound desc = could not find container \"f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455\": container with ID starting with f86bf87e3381e706152d3f2a90a381afd91d22ab1925a739a710af86f2363455 not found: ID does not exist" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.876023 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.880669 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895072 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:13 crc kubenswrapper[4786]: E1002 07:02:13.895345 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="ceilometer-notification-agent" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895361 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="ceilometer-notification-agent" Oct 02 07:02:13 crc kubenswrapper[4786]: E1002 07:02:13.895371 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="proxy-httpd" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895377 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="proxy-httpd" Oct 02 07:02:13 crc kubenswrapper[4786]: E1002 07:02:13.895389 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="sg-core" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895395 4786 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="sg-core" Oct 02 07:02:13 crc kubenswrapper[4786]: E1002 07:02:13.895406 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="ceilometer-central-agent" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895411 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="ceilometer-central-agent" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895587 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="proxy-httpd" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895599 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="ceilometer-central-agent" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895618 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="ceilometer-notification-agent" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.895624 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" containerName="sg-core" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.896904 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.899218 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.899392 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.904578 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.906345 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.906391 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.906424 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6frs\" (UniqueName: \"kubernetes.io/projected/e12af69d-1e05-48c1-84b0-71c41e97e382-kube-api-access-n6frs\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.906443 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-log-httpd\") pod \"ceilometer-0\" (UID: 
\"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.906466 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-run-httpd\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.906505 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-config-data\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:13 crc kubenswrapper[4786]: I1002 07:02:13.906537 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-scripts\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.008105 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-run-httpd\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.008173 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-config-data\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.008199 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-scripts\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.008251 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.008287 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.008321 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6frs\" (UniqueName: \"kubernetes.io/projected/e12af69d-1e05-48c1-84b0-71c41e97e382-kube-api-access-n6frs\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.008341 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-log-httpd\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.008652 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-log-httpd\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 
crc kubenswrapper[4786]: I1002 07:02:14.008843 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-run-httpd\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.013491 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.013623 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.014204 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-config-data\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.015100 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-scripts\") pod \"ceilometer-0\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.024662 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6frs\" (UniqueName: \"kubernetes.io/projected/e12af69d-1e05-48c1-84b0-71c41e97e382-kube-api-access-n6frs\") pod \"ceilometer-0\" (UID: 
\"e12af69d-1e05-48c1-84b0-71c41e97e382\") " pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.048484 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.048950 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="03227959-451d-483a-8d46-182fa634d20d" containerName="glance-httpd" containerID="cri-o://30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513" gracePeriod=30 Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.049179 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="03227959-451d-483a-8d46-182fa634d20d" containerName="glance-log" containerID="cri-o://da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692" gracePeriod=30 Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.193993 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef45940-0556-4ff7-b16c-bdab174e7f51" path="/var/lib/kubelet/pods/bef45940-0556-4ff7-b16c-bdab174e7f51/volumes" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.209067 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.543339 4786 generic.go:334] "Generic (PLEG): container finished" podID="03227959-451d-483a-8d46-182fa634d20d" containerID="da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692" exitCode=143 Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.543446 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"03227959-451d-483a-8d46-182fa634d20d","Type":"ContainerDied","Data":"da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692"} Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.575420 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:14 crc kubenswrapper[4786]: W1002 07:02:14.579149 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode12af69d_1e05_48c1_84b0_71c41e97e382.slice/crio-9510160d483f852f6dac6864f911d4ce158749b44356c50351f9dbf4081a46af WatchSource:0}: Error finding container 9510160d483f852f6dac6864f911d4ce158749b44356c50351f9dbf4081a46af: Status 404 returned error can't find the container with id 9510160d483f852f6dac6864f911d4ce158749b44356c50351f9dbf4081a46af Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.637399 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.637598 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-log" containerID="cri-o://240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e" gracePeriod=30 Oct 02 07:02:14 crc kubenswrapper[4786]: I1002 07:02:14.637682 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-httpd" containerID="cri-o://472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7" gracePeriod=30 Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.342314 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2ca4-account-create-qsdpx"] Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.343469 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2ca4-account-create-qsdpx" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.344839 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.348793 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2ca4-account-create-qsdpx"] Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.443152 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.526053 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wnv2\" (UniqueName: \"kubernetes.io/projected/e4dcd6ee-cc67-4e35-ab8b-69b792c190b7-kube-api-access-8wnv2\") pod \"nova-api-2ca4-account-create-qsdpx\" (UID: \"e4dcd6ee-cc67-4e35-ab8b-69b792c190b7\") " pod="openstack/nova-api-2ca4-account-create-qsdpx" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.530579 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c952-account-create-tw57w"] Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.539899 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c952-account-create-tw57w" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.540102 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c952-account-create-tw57w"] Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.545594 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.558923 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerStarted","Data":"3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae"} Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.558965 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerStarted","Data":"9510160d483f852f6dac6864f911d4ce158749b44356c50351f9dbf4081a46af"} Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.561349 4786 generic.go:334] "Generic (PLEG): container finished" podID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerID="240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e" exitCode=143 Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.561379 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9","Type":"ContainerDied","Data":"240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e"} Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.628217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxlmk\" (UniqueName: \"kubernetes.io/projected/c26c9b38-23f6-440e-9667-a61014dc7d4b-kube-api-access-nxlmk\") pod \"nova-cell0-c952-account-create-tw57w\" (UID: \"c26c9b38-23f6-440e-9667-a61014dc7d4b\") " 
pod="openstack/nova-cell0-c952-account-create-tw57w" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.628278 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wnv2\" (UniqueName: \"kubernetes.io/projected/e4dcd6ee-cc67-4e35-ab8b-69b792c190b7-kube-api-access-8wnv2\") pod \"nova-api-2ca4-account-create-qsdpx\" (UID: \"e4dcd6ee-cc67-4e35-ab8b-69b792c190b7\") " pod="openstack/nova-api-2ca4-account-create-qsdpx" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.641366 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wnv2\" (UniqueName: \"kubernetes.io/projected/e4dcd6ee-cc67-4e35-ab8b-69b792c190b7-kube-api-access-8wnv2\") pod \"nova-api-2ca4-account-create-qsdpx\" (UID: \"e4dcd6ee-cc67-4e35-ab8b-69b792c190b7\") " pod="openstack/nova-api-2ca4-account-create-qsdpx" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.709174 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2ca4-account-create-qsdpx" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.731672 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxlmk\" (UniqueName: \"kubernetes.io/projected/c26c9b38-23f6-440e-9667-a61014dc7d4b-kube-api-access-nxlmk\") pod \"nova-cell0-c952-account-create-tw57w\" (UID: \"c26c9b38-23f6-440e-9667-a61014dc7d4b\") " pod="openstack/nova-cell0-c952-account-create-tw57w" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.744845 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-87c5-account-create-ngfsv"] Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.745215 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxlmk\" (UniqueName: \"kubernetes.io/projected/c26c9b38-23f6-440e-9667-a61014dc7d4b-kube-api-access-nxlmk\") pod \"nova-cell0-c952-account-create-tw57w\" (UID: 
\"c26c9b38-23f6-440e-9667-a61014dc7d4b\") " pod="openstack/nova-cell0-c952-account-create-tw57w" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.746101 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-87c5-account-create-ngfsv" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.748039 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.752867 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-87c5-account-create-ngfsv"] Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.833561 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlc48\" (UniqueName: \"kubernetes.io/projected/53e54cd6-8543-4b97-a441-80ba704bb59c-kube-api-access-wlc48\") pod \"nova-cell1-87c5-account-create-ngfsv\" (UID: \"53e54cd6-8543-4b97-a441-80ba704bb59c\") " pod="openstack/nova-cell1-87c5-account-create-ngfsv" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.864511 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c952-account-create-tw57w" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.935418 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlc48\" (UniqueName: \"kubernetes.io/projected/53e54cd6-8543-4b97-a441-80ba704bb59c-kube-api-access-wlc48\") pod \"nova-cell1-87c5-account-create-ngfsv\" (UID: \"53e54cd6-8543-4b97-a441-80ba704bb59c\") " pod="openstack/nova-cell1-87c5-account-create-ngfsv" Oct 02 07:02:15 crc kubenswrapper[4786]: I1002 07:02:15.963849 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlc48\" (UniqueName: \"kubernetes.io/projected/53e54cd6-8543-4b97-a441-80ba704bb59c-kube-api-access-wlc48\") pod \"nova-cell1-87c5-account-create-ngfsv\" (UID: \"53e54cd6-8543-4b97-a441-80ba704bb59c\") " pod="openstack/nova-cell1-87c5-account-create-ngfsv" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.063126 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-87c5-account-create-ngfsv" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.108586 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2ca4-account-create-qsdpx"] Oct 02 07:02:16 crc kubenswrapper[4786]: W1002 07:02:16.122300 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4dcd6ee_cc67_4e35_ab8b_69b792c190b7.slice/crio-7204ac49b0cb45644bbf1473aa45bfea581ad60b1d75d6f2fa34bb9215bf455a WatchSource:0}: Error finding container 7204ac49b0cb45644bbf1473aa45bfea581ad60b1d75d6f2fa34bb9215bf455a: Status 404 returned error can't find the container with id 7204ac49b0cb45644bbf1473aa45bfea581ad60b1d75d6f2fa34bb9215bf455a Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.272555 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c952-account-create-tw57w"] Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.412014 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.444918 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4mnp\" (UniqueName: \"kubernetes.io/projected/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-kube-api-access-p4mnp\") pod \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.444962 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-combined-ca-bundle\") pod \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.445011 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data\") pod \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.445035 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-etc-machine-id\") pod \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.445098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-scripts\") pod \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.445116 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-logs\") pod \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.445133 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data-custom\") pod \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\" (UID: \"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f\") " Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.446203 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" (UID: "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.448024 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-logs" (OuterVolumeSpecName: "logs") pod "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" (UID: "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.452807 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-scripts" (OuterVolumeSpecName: "scripts") pod "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" (UID: "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.454948 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" (UID: "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.481509 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-87c5-account-create-ngfsv"] Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.487024 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-kube-api-access-p4mnp" (OuterVolumeSpecName: "kube-api-access-p4mnp") pod "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" (UID: "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f"). InnerVolumeSpecName "kube-api-access-p4mnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.495840 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" (UID: "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.523147 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data" (OuterVolumeSpecName: "config-data") pod "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" (UID: "3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.546364 4786 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.546393 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4mnp\" (UniqueName: \"kubernetes.io/projected/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-kube-api-access-p4mnp\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.546403 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.546413 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.546421 4786 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.546428 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.546436 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f-logs\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.585095 4786 generic.go:334] "Generic 
(PLEG): container finished" podID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerID="ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10" exitCode=137 Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.585143 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f","Type":"ContainerDied","Data":"ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10"} Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.585165 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f","Type":"ContainerDied","Data":"d32e17ee8dabf0e119212245e1a3267f9c4e8c0afc98a6b741e0ef2fa8c528fc"} Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.585181 4786 scope.go:117] "RemoveContainer" containerID="ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.585285 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.590575 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerStarted","Data":"8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172"} Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.594547 4786 generic.go:334] "Generic (PLEG): container finished" podID="e4dcd6ee-cc67-4e35-ab8b-69b792c190b7" containerID="0a36b977fca2b2ccc4b427af522e8f149fd31eb13cda835da5bad44c02d54b08" exitCode=0 Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.594613 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2ca4-account-create-qsdpx" event={"ID":"e4dcd6ee-cc67-4e35-ab8b-69b792c190b7","Type":"ContainerDied","Data":"0a36b977fca2b2ccc4b427af522e8f149fd31eb13cda835da5bad44c02d54b08"} Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.594654 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2ca4-account-create-qsdpx" event={"ID":"e4dcd6ee-cc67-4e35-ab8b-69b792c190b7","Type":"ContainerStarted","Data":"7204ac49b0cb45644bbf1473aa45bfea581ad60b1d75d6f2fa34bb9215bf455a"} Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.597422 4786 generic.go:334] "Generic (PLEG): container finished" podID="c26c9b38-23f6-440e-9667-a61014dc7d4b" containerID="6b09b618b2f9da7abfa451c77cd2ec8607af56bb1db1d42cd935ae827737e605" exitCode=0 Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.597513 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c952-account-create-tw57w" event={"ID":"c26c9b38-23f6-440e-9667-a61014dc7d4b","Type":"ContainerDied","Data":"6b09b618b2f9da7abfa451c77cd2ec8607af56bb1db1d42cd935ae827737e605"} Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.597542 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c952-account-create-tw57w" 
event={"ID":"c26c9b38-23f6-440e-9667-a61014dc7d4b","Type":"ContainerStarted","Data":"af5172d29d2d9d8c82faf2903dfda395cf38c79f7208b6bf256d3f92b1052fa2"} Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.598832 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-87c5-account-create-ngfsv" event={"ID":"53e54cd6-8543-4b97-a441-80ba704bb59c","Type":"ContainerStarted","Data":"07025a84aa0f622582ab0c727054b37f1bde707ccb8fe9a023aa3ebad1b0083e"} Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.609774 4786 scope.go:117] "RemoveContainer" containerID="8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.626361 4786 scope.go:117] "RemoveContainer" containerID="ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10" Oct 02 07:02:16 crc kubenswrapper[4786]: E1002 07:02:16.627454 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10\": container with ID starting with ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10 not found: ID does not exist" containerID="ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.627509 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10"} err="failed to get container status \"ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10\": rpc error: code = NotFound desc = could not find container \"ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10\": container with ID starting with ea57ba5e4993418423b8057904af2097d16e89657a7543a43a94fbaf66b31b10 not found: ID does not exist" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.627575 4786 scope.go:117] "RemoveContainer" 
containerID="8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa" Oct 02 07:02:16 crc kubenswrapper[4786]: E1002 07:02:16.627932 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa\": container with ID starting with 8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa not found: ID does not exist" containerID="8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.627958 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa"} err="failed to get container status \"8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa\": rpc error: code = NotFound desc = could not find container \"8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa\": container with ID starting with 8f7d5d813a2fad03f9ea44d9214b8ff70a7c0907b42805cf27641efa6ffa28aa not found: ID does not exist" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.629627 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-87c5-account-create-ngfsv" podStartSLOduration=1.629616669 podStartE2EDuration="1.629616669s" podCreationTimestamp="2025-10-02 07:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:16.617271831 +0000 UTC m=+946.738454972" watchObservedRunningTime="2025-10-02 07:02:16.629616669 +0000 UTC m=+946.750799800" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.653845 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.662124 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-api-0"] Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.667308 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 07:02:16 crc kubenswrapper[4786]: E1002 07:02:16.667664 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerName="cinder-api-log" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.667682 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerName="cinder-api-log" Oct 02 07:02:16 crc kubenswrapper[4786]: E1002 07:02:16.667708 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerName="cinder-api" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.667714 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerName="cinder-api" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.667928 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerName="cinder-api" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.667957 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" containerName="cinder-api-log" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.668859 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.670199 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.670419 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.670671 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.672273 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748547 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-config-data-custom\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748588 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/546a4a1f-7f71-4714-8d8c-a012948427bb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748606 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748652 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-scripts\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748673 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv7lj\" (UniqueName: \"kubernetes.io/projected/546a4a1f-7f71-4714-8d8c-a012948427bb-kube-api-access-kv7lj\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748743 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/546a4a1f-7f71-4714-8d8c-a012948427bb-logs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748771 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748819 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.748835 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-config-data\") pod \"cinder-api-0\" 
(UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853193 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-config-data-custom\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853236 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/546a4a1f-7f71-4714-8d8c-a012948427bb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853256 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853298 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-scripts\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853319 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv7lj\" (UniqueName: \"kubernetes.io/projected/546a4a1f-7f71-4714-8d8c-a012948427bb-kube-api-access-kv7lj\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853354 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/546a4a1f-7f71-4714-8d8c-a012948427bb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853362 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/546a4a1f-7f71-4714-8d8c-a012948427bb-logs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853420 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853514 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853546 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-config-data\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.853899 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/546a4a1f-7f71-4714-8d8c-a012948427bb-logs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: 
I1002 07:02:16.856078 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.856925 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-config-data\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.857712 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.858026 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-scripts\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.858991 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-config-data-custom\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.859289 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/546a4a1f-7f71-4714-8d8c-a012948427bb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:16 crc kubenswrapper[4786]: I1002 07:02:16.866796 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv7lj\" (UniqueName: \"kubernetes.io/projected/546a4a1f-7f71-4714-8d8c-a012948427bb-kube-api-access-kv7lj\") pod \"cinder-api-0\" (UID: \"546a4a1f-7f71-4714-8d8c-a012948427bb\") " pod="openstack/cinder-api-0" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.008449 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.384822 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 07:02:17 crc kubenswrapper[4786]: W1002 07:02:17.389872 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod546a4a1f_7f71_4714_8d8c_a012948427bb.slice/crio-6bf499f273f05ac791a8463b69db38eb718b89650b746356fc4226d27b70c7e2 WatchSource:0}: Error finding container 6bf499f273f05ac791a8463b69db38eb718b89650b746356fc4226d27b70c7e2: Status 404 returned error can't find the container with id 6bf499f273f05ac791a8463b69db38eb718b89650b746356fc4226d27b70c7e2 Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.551143 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.565436 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-logs\") pod \"03227959-451d-483a-8d46-182fa634d20d\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.566068 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-scripts\") pod \"03227959-451d-483a-8d46-182fa634d20d\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.566098 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-config-data\") pod \"03227959-451d-483a-8d46-182fa634d20d\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.567602 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwwcv\" (UniqueName: \"kubernetes.io/projected/03227959-451d-483a-8d46-182fa634d20d-kube-api-access-kwwcv\") pod \"03227959-451d-483a-8d46-182fa634d20d\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.567722 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-combined-ca-bundle\") pod \"03227959-451d-483a-8d46-182fa634d20d\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.567747 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-public-tls-certs\") pod \"03227959-451d-483a-8d46-182fa634d20d\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.567816 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-httpd-run\") pod \"03227959-451d-483a-8d46-182fa634d20d\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.567861 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"03227959-451d-483a-8d46-182fa634d20d\" (UID: \"03227959-451d-483a-8d46-182fa634d20d\") " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.565989 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-logs" (OuterVolumeSpecName: "logs") pod "03227959-451d-483a-8d46-182fa634d20d" (UID: "03227959-451d-483a-8d46-182fa634d20d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.571237 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "03227959-451d-483a-8d46-182fa634d20d" (UID: "03227959-451d-483a-8d46-182fa634d20d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.575576 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-scripts" (OuterVolumeSpecName: "scripts") pod "03227959-451d-483a-8d46-182fa634d20d" (UID: "03227959-451d-483a-8d46-182fa634d20d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.577262 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "03227959-451d-483a-8d46-182fa634d20d" (UID: "03227959-451d-483a-8d46-182fa634d20d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.587706 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03227959-451d-483a-8d46-182fa634d20d-kube-api-access-kwwcv" (OuterVolumeSpecName: "kube-api-access-kwwcv") pod "03227959-451d-483a-8d46-182fa634d20d" (UID: "03227959-451d-483a-8d46-182fa634d20d"). InnerVolumeSpecName "kube-api-access-kwwcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.599149 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03227959-451d-483a-8d46-182fa634d20d" (UID: "03227959-451d-483a-8d46-182fa634d20d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.614937 4786 generic.go:334] "Generic (PLEG): container finished" podID="03227959-451d-483a-8d46-182fa634d20d" containerID="30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513" exitCode=0 Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.615354 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"03227959-451d-483a-8d46-182fa634d20d","Type":"ContainerDied","Data":"30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513"} Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.615388 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"03227959-451d-483a-8d46-182fa634d20d","Type":"ContainerDied","Data":"03c4336f0ba0a13b4678678f78b7d2ef72ba0caf7b06a977e32559d4d741026e"} Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.615406 4786 scope.go:117] "RemoveContainer" containerID="30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.615488 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.621285 4786 generic.go:334] "Generic (PLEG): container finished" podID="53e54cd6-8543-4b97-a441-80ba704bb59c" containerID="d1dfa9a79e80682af0fb489de6befcb8fa54f1427032b3bd1129640b273c7e67" exitCode=0 Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.621824 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-87c5-account-create-ngfsv" event={"ID":"53e54cd6-8543-4b97-a441-80ba704bb59c","Type":"ContainerDied","Data":"d1dfa9a79e80682af0fb489de6befcb8fa54f1427032b3bd1129640b273c7e67"} Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.640462 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"546a4a1f-7f71-4714-8d8c-a012948427bb","Type":"ContainerStarted","Data":"6bf499f273f05ac791a8463b69db38eb718b89650b746356fc4226d27b70c7e2"} Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.650608 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "03227959-451d-483a-8d46-182fa634d20d" (UID: "03227959-451d-483a-8d46-182fa634d20d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.657124 4786 scope.go:117] "RemoveContainer" containerID="da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.663880 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerStarted","Data":"cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c"} Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.667028 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-config-data" (OuterVolumeSpecName: "config-data") pod "03227959-451d-483a-8d46-182fa634d20d" (UID: "03227959-451d-483a-8d46-182fa634d20d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.671514 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwwcv\" (UniqueName: \"kubernetes.io/projected/03227959-451d-483a-8d46-182fa634d20d-kube-api-access-kwwcv\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.671557 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.671567 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.671576 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.671606 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.671617 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03227959-451d-483a-8d46-182fa634d20d-logs\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.671624 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.671632 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03227959-451d-483a-8d46-182fa634d20d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.690010 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.701306 4786 scope.go:117] "RemoveContainer" containerID="30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513" Oct 02 07:02:17 crc kubenswrapper[4786]: E1002 07:02:17.702117 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513\": container with ID starting with 30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513 not found: ID does not exist" containerID="30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513" Oct 
02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.702141 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513"} err="failed to get container status \"30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513\": rpc error: code = NotFound desc = could not find container \"30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513\": container with ID starting with 30af55bad49ce6d516e4692f2f4a9c9559edef50826ed2854f3933c610198513 not found: ID does not exist" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.702160 4786 scope.go:117] "RemoveContainer" containerID="da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692" Oct 02 07:02:17 crc kubenswrapper[4786]: E1002 07:02:17.702342 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692\": container with ID starting with da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692 not found: ID does not exist" containerID="da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.702357 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692"} err="failed to get container status \"da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692\": rpc error: code = NotFound desc = could not find container \"da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692\": container with ID starting with da11515d498d1c9cfc8aa4d5acb28c0a6441a3ef751e2022bd4b1dcfe0edd692 not found: ID does not exist" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.773636 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.942756 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.954431 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.984084 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:02:17 crc kubenswrapper[4786]: E1002 07:02:17.984618 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03227959-451d-483a-8d46-182fa634d20d" containerName="glance-httpd" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.984636 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="03227959-451d-483a-8d46-182fa634d20d" containerName="glance-httpd" Oct 02 07:02:17 crc kubenswrapper[4786]: E1002 07:02:17.984648 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03227959-451d-483a-8d46-182fa634d20d" containerName="glance-log" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.984654 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="03227959-451d-483a-8d46-182fa634d20d" containerName="glance-log" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.984927 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="03227959-451d-483a-8d46-182fa634d20d" containerName="glance-httpd" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.984979 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="03227959-451d-483a-8d46-182fa634d20d" containerName="glance-log" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.990684 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.991096 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.994666 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 07:02:17 crc kubenswrapper[4786]: I1002 07:02:17.994910 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.182609 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc5c046-7b05-4041-bb1f-a9851dde1c79-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.182856 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.182929 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.182966 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.182994 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct4qr\" (UniqueName: \"kubernetes.io/projected/cbc5c046-7b05-4041-bb1f-a9851dde1c79-kube-api-access-ct4qr\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.183011 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.183037 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc5c046-7b05-4041-bb1f-a9851dde1c79-logs\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.183058 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.183731 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c952-account-create-tw57w" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.195930 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03227959-451d-483a-8d46-182fa634d20d" path="/var/lib/kubelet/pods/03227959-451d-483a-8d46-182fa634d20d/volumes" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.196549 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f" path="/var/lib/kubelet/pods/3329ae9d-e8ad-4a70-8e15-427f3c4d5b1f/volumes" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.203871 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2ca4-account-create-qsdpx" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.282869 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.283549 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxlmk\" (UniqueName: \"kubernetes.io/projected/c26c9b38-23f6-440e-9667-a61014dc7d4b-kube-api-access-nxlmk\") pod \"c26c9b38-23f6-440e-9667-a61014dc7d4b\" (UID: \"c26c9b38-23f6-440e-9667-a61014dc7d4b\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.283954 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.284543 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct4qr\" (UniqueName: \"kubernetes.io/projected/cbc5c046-7b05-4041-bb1f-a9851dde1c79-kube-api-access-ct4qr\") pod 
\"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.284631 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.284974 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc5c046-7b05-4041-bb1f-a9851dde1c79-logs\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.285080 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.285196 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc5c046-7b05-4041-bb1f-a9851dde1c79-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.285284 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.285408 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.285596 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.286009 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbc5c046-7b05-4041-bb1f-a9851dde1c79-logs\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.286319 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cbc5c046-7b05-4041-bb1f-a9851dde1c79-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.287927 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c26c9b38-23f6-440e-9667-a61014dc7d4b-kube-api-access-nxlmk" (OuterVolumeSpecName: "kube-api-access-nxlmk") pod "c26c9b38-23f6-440e-9667-a61014dc7d4b" (UID: "c26c9b38-23f6-440e-9667-a61014dc7d4b"). 
InnerVolumeSpecName "kube-api-access-nxlmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.288797 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.290119 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-scripts\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.290813 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.301343 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct4qr\" (UniqueName: \"kubernetes.io/projected/cbc5c046-7b05-4041-bb1f-a9851dde1c79-kube-api-access-ct4qr\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.301966 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc5c046-7b05-4041-bb1f-a9851dde1c79-config-data\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " 
pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.329183 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"cbc5c046-7b05-4041-bb1f-a9851dde1c79\") " pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386247 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-config-data\") pod \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386279 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386329 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-combined-ca-bundle\") pod \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386421 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-logs\") pod \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386465 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wnv2\" (UniqueName: 
\"kubernetes.io/projected/e4dcd6ee-cc67-4e35-ab8b-69b792c190b7-kube-api-access-8wnv2\") pod \"e4dcd6ee-cc67-4e35-ab8b-69b792c190b7\" (UID: \"e4dcd6ee-cc67-4e35-ab8b-69b792c190b7\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386502 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-httpd-run\") pod \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386515 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-scripts\") pod \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386558 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt46b\" (UniqueName: \"kubernetes.io/projected/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-kube-api-access-nt46b\") pod \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386577 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-internal-tls-certs\") pod \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\" (UID: \"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9\") " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.386884 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxlmk\" (UniqueName: \"kubernetes.io/projected/c26c9b38-23f6-440e-9667-a61014dc7d4b-kube-api-access-nxlmk\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.390218 4786 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-kube-api-access-nt46b" (OuterVolumeSpecName: "kube-api-access-nt46b") pod "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" (UID: "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9"). InnerVolumeSpecName "kube-api-access-nt46b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.390321 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" (UID: "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.390341 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4dcd6ee-cc67-4e35-ab8b-69b792c190b7-kube-api-access-8wnv2" (OuterVolumeSpecName: "kube-api-access-8wnv2") pod "e4dcd6ee-cc67-4e35-ab8b-69b792c190b7" (UID: "e4dcd6ee-cc67-4e35-ab8b-69b792c190b7"). InnerVolumeSpecName "kube-api-access-8wnv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.390629 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-logs" (OuterVolumeSpecName: "logs") pod "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" (UID: "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.392788 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-scripts" (OuterVolumeSpecName: "scripts") pod "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" (UID: "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.400072 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" (UID: "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.417204 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" (UID: "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.434444 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-config-data" (OuterVolumeSpecName: "config-data") pod "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" (UID: "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.434951 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" (UID: "c3dc169b-9901-4a6f-85dc-8c82c43ac1c9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488487 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488523 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488545 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488556 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-logs\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488566 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wnv2\" (UniqueName: \"kubernetes.io/projected/e4dcd6ee-cc67-4e35-ab8b-69b792c190b7-kube-api-access-8wnv2\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488575 4786 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488582 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488590 4786 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-nt46b\" (UniqueName: \"kubernetes.io/projected/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-kube-api-access-nt46b\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.488599 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.502250 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.590256 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.611682 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.672925 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c952-account-create-tw57w" event={"ID":"c26c9b38-23f6-440e-9667-a61014dc7d4b","Type":"ContainerDied","Data":"af5172d29d2d9d8c82faf2903dfda395cf38c79f7208b6bf256d3f92b1052fa2"} Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.672953 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af5172d29d2d9d8c82faf2903dfda395cf38c79f7208b6bf256d3f92b1052fa2" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.673002 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c952-account-create-tw57w" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.692827 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"546a4a1f-7f71-4714-8d8c-a012948427bb","Type":"ContainerStarted","Data":"d66b3098809cd317d2a1233bf1f7be7c4a17bd5d5b97ee1308f7771306a585d1"} Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.692870 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"546a4a1f-7f71-4714-8d8c-a012948427bb","Type":"ContainerStarted","Data":"1e7f2c3bb0684b72ddfd9f4086a0ecf39dc7bb9b25249f5d7bbabd220ba85089"} Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.693267 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.704347 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerStarted","Data":"f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c"} Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.704466 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="ceilometer-central-agent" containerID="cri-o://3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae" gracePeriod=30 Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.704565 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.704585 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="proxy-httpd" containerID="cri-o://f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c" gracePeriod=30 Oct 
02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.704619 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="ceilometer-notification-agent" containerID="cri-o://8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172" gracePeriod=30 Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.704683 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="sg-core" containerID="cri-o://cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c" gracePeriod=30 Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.720327 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.720315269 podStartE2EDuration="2.720315269s" podCreationTimestamp="2025-10-02 07:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:18.709022074 +0000 UTC m=+948.830205215" watchObservedRunningTime="2025-10-02 07:02:18.720315269 +0000 UTC m=+948.841498401" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.732260 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2ca4-account-create-qsdpx" event={"ID":"e4dcd6ee-cc67-4e35-ab8b-69b792c190b7","Type":"ContainerDied","Data":"7204ac49b0cb45644bbf1473aa45bfea581ad60b1d75d6f2fa34bb9215bf455a"} Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.732286 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7204ac49b0cb45644bbf1473aa45bfea581ad60b1d75d6f2fa34bb9215bf455a" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.732329 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2ca4-account-create-qsdpx" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.741519 4786 generic.go:334] "Generic (PLEG): container finished" podID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerID="472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7" exitCode=0 Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.741712 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.742668 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9","Type":"ContainerDied","Data":"472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7"} Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.742829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c3dc169b-9901-4a6f-85dc-8c82c43ac1c9","Type":"ContainerDied","Data":"51b0574098bf921d91f6e77111fa094dc422f2f831edd09a79a93ce1eb8e4898"} Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.742857 4786 scope.go:117] "RemoveContainer" containerID="472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.743517 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.06675372 podStartE2EDuration="5.743504318s" podCreationTimestamp="2025-10-02 07:02:13 +0000 UTC" firstStartedPulling="2025-10-02 07:02:14.58071558 +0000 UTC m=+944.701898710" lastFinishedPulling="2025-10-02 07:02:18.257466177 +0000 UTC m=+948.378649308" observedRunningTime="2025-10-02 07:02:18.73966977 +0000 UTC m=+948.860852901" watchObservedRunningTime="2025-10-02 07:02:18.743504318 +0000 UTC m=+948.864687449" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.799792 4786 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.807710 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.817707 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:02:18 crc kubenswrapper[4786]: E1002 07:02:18.818030 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-log" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.818043 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-log" Oct 02 07:02:18 crc kubenswrapper[4786]: E1002 07:02:18.818059 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4dcd6ee-cc67-4e35-ab8b-69b792c190b7" containerName="mariadb-account-create" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.818065 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4dcd6ee-cc67-4e35-ab8b-69b792c190b7" containerName="mariadb-account-create" Oct 02 07:02:18 crc kubenswrapper[4786]: E1002 07:02:18.818081 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-httpd" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.818087 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-httpd" Oct 02 07:02:18 crc kubenswrapper[4786]: E1002 07:02:18.818100 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c26c9b38-23f6-440e-9667-a61014dc7d4b" containerName="mariadb-account-create" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.818105 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c26c9b38-23f6-440e-9667-a61014dc7d4b" 
containerName="mariadb-account-create" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.818269 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c26c9b38-23f6-440e-9667-a61014dc7d4b" containerName="mariadb-account-create" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.818283 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4dcd6ee-cc67-4e35-ab8b-69b792c190b7" containerName="mariadb-account-create" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.818294 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-log" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.818301 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-httpd" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.822854 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.827365 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.835920 4786 scope.go:117] "RemoveContainer" containerID="240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.836230 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.836277 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.959047 4786 scope.go:117] "RemoveContainer" containerID="472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7" Oct 02 07:02:18 crc kubenswrapper[4786]: E1002 07:02:18.959610 4786 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7\": container with ID starting with 472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7 not found: ID does not exist" containerID="472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.959636 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7"} err="failed to get container status \"472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7\": rpc error: code = NotFound desc = could not find container \"472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7\": container with ID starting with 472aa8696de11685976013caec961f1fa747d29b603235e575e9fa35d4c31de7 not found: ID does not exist" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.959655 4786 scope.go:117] "RemoveContainer" containerID="240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e" Oct 02 07:02:18 crc kubenswrapper[4786]: E1002 07:02:18.959943 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e\": container with ID starting with 240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e not found: ID does not exist" containerID="240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.959961 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e"} err="failed to get container status \"240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e\": rpc error: code = NotFound 
desc = could not find container \"240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e\": container with ID starting with 240b2e8d8dc100bf56e7584a19a351332d0a487d5a4e54f65953e6adc4385c9e not found: ID does not exist" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.999445 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.999491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/326311ac-c390-46ad-bdd7-29ce60f094bc-logs\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.999512 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.999554 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.999571 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.999595 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/326311ac-c390-46ad-bdd7-29ce60f094bc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.999621 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvzsh\" (UniqueName: \"kubernetes.io/projected/326311ac-c390-46ad-bdd7-29ce60f094bc-kube-api-access-rvzsh\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:18 crc kubenswrapper[4786]: I1002 07:02:18.999782 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.098247 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-87c5-account-create-ngfsv" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101035 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101083 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/326311ac-c390-46ad-bdd7-29ce60f094bc-logs\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101109 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101142 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101171 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 
crc kubenswrapper[4786]: I1002 07:02:19.101202 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/326311ac-c390-46ad-bdd7-29ce60f094bc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101238 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvzsh\" (UniqueName: \"kubernetes.io/projected/326311ac-c390-46ad-bdd7-29ce60f094bc-kube-api-access-rvzsh\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101287 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101385 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.101579 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/326311ac-c390-46ad-bdd7-29ce60f094bc-logs\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.102245 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/326311ac-c390-46ad-bdd7-29ce60f094bc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.106881 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.107012 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.107278 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.108049 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/326311ac-c390-46ad-bdd7-29ce60f094bc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.120305 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvzsh\" (UniqueName: 
\"kubernetes.io/projected/326311ac-c390-46ad-bdd7-29ce60f094bc-kube-api-access-rvzsh\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.121876 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.142509 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"326311ac-c390-46ad-bdd7-29ce60f094bc\") " pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.202551 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlc48\" (UniqueName: \"kubernetes.io/projected/53e54cd6-8543-4b97-a441-80ba704bb59c-kube-api-access-wlc48\") pod \"53e54cd6-8543-4b97-a441-80ba704bb59c\" (UID: \"53e54cd6-8543-4b97-a441-80ba704bb59c\") " Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.205470 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e54cd6-8543-4b97-a441-80ba704bb59c-kube-api-access-wlc48" (OuterVolumeSpecName: "kube-api-access-wlc48") pod "53e54cd6-8543-4b97-a441-80ba704bb59c" (UID: "53e54cd6-8543-4b97-a441-80ba704bb59c"). InnerVolumeSpecName "kube-api-access-wlc48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.223660 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.305778 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlc48\" (UniqueName: \"kubernetes.io/projected/53e54cd6-8543-4b97-a441-80ba704bb59c-kube-api-access-wlc48\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.709194 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.752017 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"326311ac-c390-46ad-bdd7-29ce60f094bc","Type":"ContainerStarted","Data":"30a025ec583caf8b46efb2d9bac8c8305b46567d2f5e8f4b3367499061b680fb"} Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.754971 4786 generic.go:334] "Generic (PLEG): container finished" podID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerID="f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c" exitCode=0 Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.754992 4786 generic.go:334] "Generic (PLEG): container finished" podID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerID="cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c" exitCode=2 Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.754999 4786 generic.go:334] "Generic (PLEG): container finished" podID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerID="8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172" exitCode=0 Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.754997 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerDied","Data":"f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c"} Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.755044 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerDied","Data":"cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c"} Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.755054 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerDied","Data":"8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172"} Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.757627 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-87c5-account-create-ngfsv" event={"ID":"53e54cd6-8543-4b97-a441-80ba704bb59c","Type":"ContainerDied","Data":"07025a84aa0f622582ab0c727054b37f1bde707ccb8fe9a023aa3ebad1b0083e"} Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.757650 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07025a84aa0f622582ab0c727054b37f1bde707ccb8fe9a023aa3ebad1b0083e" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.757663 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-87c5-account-create-ngfsv" Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.759835 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc5c046-7b05-4041-bb1f-a9851dde1c79","Type":"ContainerStarted","Data":"6d38ca432d2a36a57ebf2d99b817abc16a24c45dcd9e0f7ff175d21f642e3aa0"} Oct 02 07:02:19 crc kubenswrapper[4786]: I1002 07:02:19.759861 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc5c046-7b05-4041-bb1f-a9851dde1c79","Type":"ContainerStarted","Data":"fc80c2d9389c750412ab3943739ef38fc76a489cbf7715842e2c4c89939a76cd"} Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.188546 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" path="/var/lib/kubelet/pods/c3dc169b-9901-4a6f-85dc-8c82c43ac1c9/volumes" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.768745 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cbc5c046-7b05-4041-bb1f-a9851dde1c79","Type":"ContainerStarted","Data":"658dcd6f3e63e67601dea34060167491f7ac73e7c0bc2253c39d756021755f42"} Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.770298 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"326311ac-c390-46ad-bdd7-29ce60f094bc","Type":"ContainerStarted","Data":"84068c7af85a4e24450e486b7cd2030faf5b237c373475c590f37a78a9c0667f"} Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.770334 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"326311ac-c390-46ad-bdd7-29ce60f094bc","Type":"ContainerStarted","Data":"e91efc459ef7b27276fd2d30117003e50a4287831d123263bc82a1c93be856d2"} Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.796441 4786 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.796425153 podStartE2EDuration="3.796425153s" podCreationTimestamp="2025-10-02 07:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:20.792882126 +0000 UTC m=+950.914065267" watchObservedRunningTime="2025-10-02 07:02:20.796425153 +0000 UTC m=+950.917608284" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.820481 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.820465205 podStartE2EDuration="2.820465205s" podCreationTimestamp="2025-10-02 07:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:20.813748025 +0000 UTC m=+950.934931176" watchObservedRunningTime="2025-10-02 07:02:20.820465205 +0000 UTC m=+950.941648326" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.926186 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgckl"] Oct 02 07:02:20 crc kubenswrapper[4786]: E1002 07:02:20.926495 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e54cd6-8543-4b97-a441-80ba704bb59c" containerName="mariadb-account-create" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.926511 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e54cd6-8543-4b97-a441-80ba704bb59c" containerName="mariadb-account-create" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.926702 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e54cd6-8543-4b97-a441-80ba704bb59c" containerName="mariadb-account-create" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.927150 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.928828 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gbm4v" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.931888 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.931909 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 07:02:20 crc kubenswrapper[4786]: I1002 07:02:20.944278 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgckl"] Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.029093 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-config-data\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.029201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.029229 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q9hp\" (UniqueName: \"kubernetes.io/projected/311482dc-62f2-47f5-b1ea-005877d89e83-kube-api-access-9q9hp\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " 
pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.029292 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-scripts\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.131180 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-config-data\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.131435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.131463 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q9hp\" (UniqueName: \"kubernetes.io/projected/311482dc-62f2-47f5-b1ea-005877d89e83-kube-api-access-9q9hp\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.131554 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-scripts\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " 
pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.135683 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-config-data\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.137162 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-scripts\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.143402 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.143653 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q9hp\" (UniqueName: \"kubernetes.io/projected/311482dc-62f2-47f5-b1ea-005877d89e83-kube-api-access-9q9hp\") pod \"nova-cell0-conductor-db-sync-rgckl\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.240630 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.630083 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgckl"] Oct 02 07:02:21 crc kubenswrapper[4786]: W1002 07:02:21.646880 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311482dc_62f2_47f5_b1ea_005877d89e83.slice/crio-2405aba93522609d04f478059fb34d3ce0629f13d68c64409c67e7996231b5b0 WatchSource:0}: Error finding container 2405aba93522609d04f478059fb34d3ce0629f13d68c64409c67e7996231b5b0: Status 404 returned error can't find the container with id 2405aba93522609d04f478059fb34d3ce0629f13d68c64409c67e7996231b5b0 Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.723082 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.782282 4786 generic.go:334] "Generic (PLEG): container finished" podID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerID="3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae" exitCode=0 Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.782332 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.782355 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerDied","Data":"3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae"} Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.782391 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e12af69d-1e05-48c1-84b0-71c41e97e382","Type":"ContainerDied","Data":"9510160d483f852f6dac6864f911d4ce158749b44356c50351f9dbf4081a46af"} Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.782409 4786 scope.go:117] "RemoveContainer" containerID="f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.783327 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rgckl" event={"ID":"311482dc-62f2-47f5-b1ea-005877d89e83","Type":"ContainerStarted","Data":"2405aba93522609d04f478059fb34d3ce0629f13d68c64409c67e7996231b5b0"} Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.801408 4786 scope.go:117] "RemoveContainer" containerID="cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.815477 4786 scope.go:117] "RemoveContainer" containerID="8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.834996 4786 scope.go:117] "RemoveContainer" containerID="3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.843294 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-scripts\") pod \"e12af69d-1e05-48c1-84b0-71c41e97e382\" (UID: 
\"e12af69d-1e05-48c1-84b0-71c41e97e382\") " Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.843333 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-log-httpd\") pod \"e12af69d-1e05-48c1-84b0-71c41e97e382\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.843419 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-run-httpd\") pod \"e12af69d-1e05-48c1-84b0-71c41e97e382\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.843497 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-sg-core-conf-yaml\") pod \"e12af69d-1e05-48c1-84b0-71c41e97e382\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.843546 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-combined-ca-bundle\") pod \"e12af69d-1e05-48c1-84b0-71c41e97e382\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.843588 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6frs\" (UniqueName: \"kubernetes.io/projected/e12af69d-1e05-48c1-84b0-71c41e97e382-kube-api-access-n6frs\") pod \"e12af69d-1e05-48c1-84b0-71c41e97e382\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.843606 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-config-data\") pod \"e12af69d-1e05-48c1-84b0-71c41e97e382\" (UID: \"e12af69d-1e05-48c1-84b0-71c41e97e382\") " Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.843944 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e12af69d-1e05-48c1-84b0-71c41e97e382" (UID: "e12af69d-1e05-48c1-84b0-71c41e97e382"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.844104 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e12af69d-1e05-48c1-84b0-71c41e97e382" (UID: "e12af69d-1e05-48c1-84b0-71c41e97e382"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.844368 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.844381 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e12af69d-1e05-48c1-84b0-71c41e97e382-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.847655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e12af69d-1e05-48c1-84b0-71c41e97e382-kube-api-access-n6frs" (OuterVolumeSpecName: "kube-api-access-n6frs") pod "e12af69d-1e05-48c1-84b0-71c41e97e382" (UID: "e12af69d-1e05-48c1-84b0-71c41e97e382"). InnerVolumeSpecName "kube-api-access-n6frs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.852467 4786 scope.go:117] "RemoveContainer" containerID="f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.852555 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-scripts" (OuterVolumeSpecName: "scripts") pod "e12af69d-1e05-48c1-84b0-71c41e97e382" (UID: "e12af69d-1e05-48c1-84b0-71c41e97e382"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:21 crc kubenswrapper[4786]: E1002 07:02:21.852984 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c\": container with ID starting with f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c not found: ID does not exist" containerID="f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.853019 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c"} err="failed to get container status \"f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c\": rpc error: code = NotFound desc = could not find container \"f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c\": container with ID starting with f18c9d0dce0fbf4564cb920e6a6ff1b963c3105e72a0210d3fa66b3939922c6c not found: ID does not exist" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.853042 4786 scope.go:117] "RemoveContainer" containerID="cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c" Oct 02 07:02:21 crc kubenswrapper[4786]: E1002 07:02:21.853321 4786 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c\": container with ID starting with cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c not found: ID does not exist" containerID="cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.853351 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c"} err="failed to get container status \"cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c\": rpc error: code = NotFound desc = could not find container \"cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c\": container with ID starting with cd80e1cdb07f4f0f39786da54e51330f224309759ccb797a4f2572a8299f918c not found: ID does not exist" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.853383 4786 scope.go:117] "RemoveContainer" containerID="8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172" Oct 02 07:02:21 crc kubenswrapper[4786]: E1002 07:02:21.853646 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172\": container with ID starting with 8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172 not found: ID does not exist" containerID="8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.853681 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172"} err="failed to get container status \"8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172\": rpc error: code = NotFound desc = could not find container 
\"8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172\": container with ID starting with 8e17d6ca2471684949991a8ccb631d2309b2193e6e55f27ed05ec732307c3172 not found: ID does not exist" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.853707 4786 scope.go:117] "RemoveContainer" containerID="3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae" Oct 02 07:02:21 crc kubenswrapper[4786]: E1002 07:02:21.854072 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae\": container with ID starting with 3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae not found: ID does not exist" containerID="3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.854098 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae"} err="failed to get container status \"3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae\": rpc error: code = NotFound desc = could not find container \"3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae\": container with ID starting with 3e7abda0964f9f18d997acb8e7fdd24d5b024101e9199ae645f1cfe5601cf9ae not found: ID does not exist" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.864265 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e12af69d-1e05-48c1-84b0-71c41e97e382" (UID: "e12af69d-1e05-48c1-84b0-71c41e97e382"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.892892 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e12af69d-1e05-48c1-84b0-71c41e97e382" (UID: "e12af69d-1e05-48c1-84b0-71c41e97e382"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.912133 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-config-data" (OuterVolumeSpecName: "config-data") pod "e12af69d-1e05-48c1-84b0-71c41e97e382" (UID: "e12af69d-1e05-48c1-84b0-71c41e97e382"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.945846 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.945892 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.945902 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6frs\" (UniqueName: \"kubernetes.io/projected/e12af69d-1e05-48c1-84b0-71c41e97e382-kube-api-access-n6frs\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.945912 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-config-data\") on node \"crc\" 
DevicePath \"\"" Oct 02 07:02:21 crc kubenswrapper[4786]: I1002 07:02:21.945920 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e12af69d-1e05-48c1-84b0-71c41e97e382-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.105168 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.109667 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120066 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:22 crc kubenswrapper[4786]: E1002 07:02:22.120382 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="ceilometer-central-agent" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120400 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="ceilometer-central-agent" Oct 02 07:02:22 crc kubenswrapper[4786]: E1002 07:02:22.120423 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="sg-core" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120429 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="sg-core" Oct 02 07:02:22 crc kubenswrapper[4786]: E1002 07:02:22.120449 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="proxy-httpd" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120455 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="proxy-httpd" Oct 02 07:02:22 crc kubenswrapper[4786]: E1002 07:02:22.120468 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="ceilometer-notification-agent" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120475 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="ceilometer-notification-agent" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120672 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="ceilometer-central-agent" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120707 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="proxy-httpd" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120726 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="ceilometer-notification-agent" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.120735 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" containerName="sg-core" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.122078 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.124023 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.124201 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.129624 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.186439 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e12af69d-1e05-48c1-84b0-71c41e97e382" path="/var/lib/kubelet/pods/e12af69d-1e05-48c1-84b0-71c41e97e382/volumes" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.250235 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcdmf\" (UniqueName: \"kubernetes.io/projected/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-kube-api-access-xcdmf\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.250269 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-run-httpd\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.250290 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-log-httpd\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.250314 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.250525 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-config-data\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.250989 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-scripts\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.251182 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352237 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcdmf\" (UniqueName: \"kubernetes.io/projected/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-kube-api-access-xcdmf\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352268 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-run-httpd\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352286 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-log-httpd\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352324 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352368 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-config-data\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352427 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-scripts\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352487 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352602 
4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-run-httpd\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.352896 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-log-httpd\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.360484 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-config-data\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.360787 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.365846 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.370000 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-scripts\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 
07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.376349 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcdmf\" (UniqueName: \"kubernetes.io/projected/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-kube-api-access-xcdmf\") pod \"ceilometer-0\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.434376 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:02:22 crc kubenswrapper[4786]: I1002 07:02:22.816867 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:22 crc kubenswrapper[4786]: W1002 07:02:22.819430 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0fbbc36_d2d7_4c19_aec8_baa1430a45bb.slice/crio-1211e0249ec700ac84335622a787b935ac2460625f807cf683edea38fa1767f8 WatchSource:0}: Error finding container 1211e0249ec700ac84335622a787b935ac2460625f807cf683edea38fa1767f8: Status 404 returned error can't find the container with id 1211e0249ec700ac84335622a787b935ac2460625f807cf683edea38fa1767f8 Oct 02 07:02:23 crc kubenswrapper[4786]: I1002 07:02:23.805585 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerStarted","Data":"367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1"} Oct 02 07:02:23 crc kubenswrapper[4786]: I1002 07:02:23.805784 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerStarted","Data":"1211e0249ec700ac84335622a787b935ac2460625f807cf683edea38fa1767f8"} Oct 02 07:02:24 crc kubenswrapper[4786]: I1002 07:02:24.813342 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerStarted","Data":"2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17"} Oct 02 07:02:25 crc kubenswrapper[4786]: I1002 07:02:25.821792 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerStarted","Data":"af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9"} Oct 02 07:02:26 crc kubenswrapper[4786]: I1002 07:02:26.840566 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerStarted","Data":"8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53"} Oct 02 07:02:26 crc kubenswrapper[4786]: I1002 07:02:26.840821 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 07:02:26 crc kubenswrapper[4786]: I1002 07:02:26.860089 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.208769211 podStartE2EDuration="4.860075819s" podCreationTimestamp="2025-10-02 07:02:22 +0000 UTC" firstStartedPulling="2025-10-02 07:02:22.821369738 +0000 UTC m=+952.942552870" lastFinishedPulling="2025-10-02 07:02:26.472676347 +0000 UTC m=+956.593859478" observedRunningTime="2025-10-02 07:02:26.855009057 +0000 UTC m=+956.976192198" watchObservedRunningTime="2025-10-02 07:02:26.860075819 +0000 UTC m=+956.981258949" Oct 02 07:02:27 crc kubenswrapper[4786]: I1002 07:02:27.497309 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:02:27 crc kubenswrapper[4786]: I1002 07:02:27.497526 4786 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:02:27 crc kubenswrapper[4786]: I1002 07:02:27.497581 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 07:02:27 crc kubenswrapper[4786]: I1002 07:02:27.498181 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7bde13c3a2f638d163652c020a2e40b2d8399d146317237502071d1e44c36be"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 07:02:27 crc kubenswrapper[4786]: I1002 07:02:27.498231 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://d7bde13c3a2f638d163652c020a2e40b2d8399d146317237502071d1e44c36be" gracePeriod=600 Oct 02 07:02:27 crc kubenswrapper[4786]: I1002 07:02:27.847798 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="d7bde13c3a2f638d163652c020a2e40b2d8399d146317237502071d1e44c36be" exitCode=0 Oct 02 07:02:27 crc kubenswrapper[4786]: I1002 07:02:27.848841 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"d7bde13c3a2f638d163652c020a2e40b2d8399d146317237502071d1e44c36be"} Oct 02 07:02:27 crc kubenswrapper[4786]: I1002 07:02:27.848888 4786 scope.go:117] "RemoveContainer" 
containerID="613104429ff7e56e4d88582bf64ddcf8603f93d2b0b8b15a934f56112fabd10d" Oct 02 07:02:28 crc kubenswrapper[4786]: I1002 07:02:28.581317 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 07:02:28 crc kubenswrapper[4786]: I1002 07:02:28.612230 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 07:02:28 crc kubenswrapper[4786]: I1002 07:02:28.612280 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 07:02:28 crc kubenswrapper[4786]: I1002 07:02:28.636776 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 07:02:28 crc kubenswrapper[4786]: I1002 07:02:28.645824 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 07:02:28 crc kubenswrapper[4786]: I1002 07:02:28.860358 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 07:02:28 crc kubenswrapper[4786]: I1002 07:02:28.860396 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.207175 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg47"] Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.209124 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.213767 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg47"] Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.232789 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.232824 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.257979 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.260230 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gdc7\" (UniqueName: \"kubernetes.io/projected/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-kube-api-access-6gdc7\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.260275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-utilities\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.260458 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-catalog-content\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " 
pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.270074 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.361852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-catalog-content\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.362002 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gdc7\" (UniqueName: \"kubernetes.io/projected/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-kube-api-access-6gdc7\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.362031 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-utilities\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.362267 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-catalog-content\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.362405 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-utilities\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.385175 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gdc7\" (UniqueName: \"kubernetes.io/projected/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-kube-api-access-6gdc7\") pod \"redhat-marketplace-6qg47\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.538835 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.866137 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:29 crc kubenswrapper[4786]: I1002 07:02:29.866355 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:30 crc kubenswrapper[4786]: I1002 07:02:30.569557 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 07:02:30 crc kubenswrapper[4786]: I1002 07:02:30.613408 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.606050 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.606933 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.658367 4786 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-6qg47"] Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.884103 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerID="004eebaef020a44a5c2465bcf70a8ce5b7043913ca3d70372b22019431b9f805" exitCode=0 Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.884156 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg47" event={"ID":"f5082899-e4d4-4228-8cb6-65f5a97c2bc7","Type":"ContainerDied","Data":"004eebaef020a44a5c2465bcf70a8ce5b7043913ca3d70372b22019431b9f805"} Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.884179 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg47" event={"ID":"f5082899-e4d4-4228-8cb6-65f5a97c2bc7","Type":"ContainerStarted","Data":"caf8ed72646825703fb78e9a5940bb7847d5e24871ac1a3e9ea1002cc03c51c8"} Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.887228 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"090157c59cd3f7df3b613a19c4694de2405b30b33d0b408048a029fe6c54264a"} Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.891139 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rgckl" event={"ID":"311482dc-62f2-47f5-b1ea-005877d89e83","Type":"ContainerStarted","Data":"f2bea0cf64ded2b35bd67c0ce372b50cd2bd8e4626631055b016d8c9ce305f9e"} Oct 02 07:02:31 crc kubenswrapper[4786]: I1002 07:02:31.917175 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rgckl" podStartSLOduration=2.301978431 podStartE2EDuration="11.917162577s" podCreationTimestamp="2025-10-02 07:02:20 +0000 UTC" firstStartedPulling="2025-10-02 07:02:21.648667045 +0000 UTC 
m=+951.769850176" lastFinishedPulling="2025-10-02 07:02:31.263851191 +0000 UTC m=+961.385034322" observedRunningTime="2025-10-02 07:02:31.91204418 +0000 UTC m=+962.033227321" watchObservedRunningTime="2025-10-02 07:02:31.917162577 +0000 UTC m=+962.038345709" Oct 02 07:02:32 crc kubenswrapper[4786]: I1002 07:02:32.899267 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerID="bee5eb57b4a597a7a3734e4993c0d875275ec6c8abf244fb17a310507fc6a301" exitCode=0 Oct 02 07:02:32 crc kubenswrapper[4786]: I1002 07:02:32.899336 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg47" event={"ID":"f5082899-e4d4-4228-8cb6-65f5a97c2bc7","Type":"ContainerDied","Data":"bee5eb57b4a597a7a3734e4993c0d875275ec6c8abf244fb17a310507fc6a301"} Oct 02 07:02:33 crc kubenswrapper[4786]: I1002 07:02:33.908246 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg47" event={"ID":"f5082899-e4d4-4228-8cb6-65f5a97c2bc7","Type":"ContainerStarted","Data":"3a3529051142c5b504c69b335b97ebfb45f2aaa53c2ad825f708add66d2e8a86"} Oct 02 07:02:33 crc kubenswrapper[4786]: I1002 07:02:33.926960 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6qg47" podStartSLOduration=3.363757233 podStartE2EDuration="4.926947037s" podCreationTimestamp="2025-10-02 07:02:29 +0000 UTC" firstStartedPulling="2025-10-02 07:02:31.88591537 +0000 UTC m=+962.007098492" lastFinishedPulling="2025-10-02 07:02:33.449105164 +0000 UTC m=+963.570288296" observedRunningTime="2025-10-02 07:02:33.923316904 +0000 UTC m=+964.044500035" watchObservedRunningTime="2025-10-02 07:02:33.926947037 +0000 UTC m=+964.048130168" Oct 02 07:02:33 crc kubenswrapper[4786]: I1002 07:02:33.994851 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sdlr9"] Oct 02 07:02:33 crc kubenswrapper[4786]: 
I1002 07:02:33.996392 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.007246 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdlr9"] Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.043712 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5mng\" (UniqueName: \"kubernetes.io/projected/033778c0-6165-45d9-b4fc-87d0cadd03d1-kube-api-access-q5mng\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.043777 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-catalog-content\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.043854 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-utilities\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.145799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-utilities\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: 
I1002 07:02:34.145914 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5mng\" (UniqueName: \"kubernetes.io/projected/033778c0-6165-45d9-b4fc-87d0cadd03d1-kube-api-access-q5mng\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.145981 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-catalog-content\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.146429 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-catalog-content\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.146615 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-utilities\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.162566 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5mng\" (UniqueName: \"kubernetes.io/projected/033778c0-6165-45d9-b4fc-87d0cadd03d1-kube-api-access-q5mng\") pod \"certified-operators-sdlr9\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.193023 4786 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxt7n"] Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.194509 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.202956 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxt7n"] Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.247456 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4fbq\" (UniqueName: \"kubernetes.io/projected/c108f77a-18c2-4814-a4f8-edfcc456e1ae-kube-api-access-t4fbq\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.247641 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-utilities\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.247745 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-catalog-content\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.311997 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.348964 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4fbq\" (UniqueName: \"kubernetes.io/projected/c108f77a-18c2-4814-a4f8-edfcc456e1ae-kube-api-access-t4fbq\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.349077 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-utilities\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.349133 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-catalog-content\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.349519 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-utilities\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.349563 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-catalog-content\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " 
pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.364131 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4fbq\" (UniqueName: \"kubernetes.io/projected/c108f77a-18c2-4814-a4f8-edfcc456e1ae-kube-api-access-t4fbq\") pod \"community-operators-mxt7n\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.521068 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.811129 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sdlr9"] Oct 02 07:02:34 crc kubenswrapper[4786]: I1002 07:02:34.933466 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlr9" event={"ID":"033778c0-6165-45d9-b4fc-87d0cadd03d1","Type":"ContainerStarted","Data":"0a08ea7a3fc046a4d094012af49f7fe7cecea4489c1ac68b6aeafa56e0b7d6e7"} Oct 02 07:02:35 crc kubenswrapper[4786]: I1002 07:02:35.149676 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxt7n"] Oct 02 07:02:35 crc kubenswrapper[4786]: W1002 07:02:35.152398 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc108f77a_18c2_4814_a4f8_edfcc456e1ae.slice/crio-784a06f7da0bd51f761df61cb6cfddd7ee67c82ddbd786563cf5ef7f5ae4b434 WatchSource:0}: Error finding container 784a06f7da0bd51f761df61cb6cfddd7ee67c82ddbd786563cf5ef7f5ae4b434: Status 404 returned error can't find the container with id 784a06f7da0bd51f761df61cb6cfddd7ee67c82ddbd786563cf5ef7f5ae4b434 Oct 02 07:02:35 crc kubenswrapper[4786]: I1002 07:02:35.941010 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerID="b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a" exitCode=0 Oct 02 07:02:35 crc kubenswrapper[4786]: I1002 07:02:35.941109 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlr9" event={"ID":"033778c0-6165-45d9-b4fc-87d0cadd03d1","Type":"ContainerDied","Data":"b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a"} Oct 02 07:02:35 crc kubenswrapper[4786]: I1002 07:02:35.943748 4786 generic.go:334] "Generic (PLEG): container finished" podID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerID="f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9" exitCode=0 Oct 02 07:02:35 crc kubenswrapper[4786]: I1002 07:02:35.943802 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxt7n" event={"ID":"c108f77a-18c2-4814-a4f8-edfcc456e1ae","Type":"ContainerDied","Data":"f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9"} Oct 02 07:02:35 crc kubenswrapper[4786]: I1002 07:02:35.943827 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxt7n" event={"ID":"c108f77a-18c2-4814-a4f8-edfcc456e1ae","Type":"ContainerStarted","Data":"784a06f7da0bd51f761df61cb6cfddd7ee67c82ddbd786563cf5ef7f5ae4b434"} Oct 02 07:02:36 crc kubenswrapper[4786]: I1002 07:02:36.968444 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxt7n" event={"ID":"c108f77a-18c2-4814-a4f8-edfcc456e1ae","Type":"ContainerStarted","Data":"1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3"} Oct 02 07:02:37 crc kubenswrapper[4786]: I1002 07:02:37.980087 4786 generic.go:334] "Generic (PLEG): container finished" podID="311482dc-62f2-47f5-b1ea-005877d89e83" containerID="f2bea0cf64ded2b35bd67c0ce372b50cd2bd8e4626631055b016d8c9ce305f9e" exitCode=0 Oct 02 07:02:37 crc kubenswrapper[4786]: I1002 07:02:37.980140 
4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rgckl" event={"ID":"311482dc-62f2-47f5-b1ea-005877d89e83","Type":"ContainerDied","Data":"f2bea0cf64ded2b35bd67c0ce372b50cd2bd8e4626631055b016d8c9ce305f9e"} Oct 02 07:02:37 crc kubenswrapper[4786]: I1002 07:02:37.983297 4786 generic.go:334] "Generic (PLEG): container finished" podID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerID="1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3" exitCode=0 Oct 02 07:02:37 crc kubenswrapper[4786]: I1002 07:02:37.983334 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxt7n" event={"ID":"c108f77a-18c2-4814-a4f8-edfcc456e1ae","Type":"ContainerDied","Data":"1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3"} Oct 02 07:02:39 crc kubenswrapper[4786]: I1002 07:02:39.539743 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:39 crc kubenswrapper[4786]: I1002 07:02:39.539965 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:39 crc kubenswrapper[4786]: I1002 07:02:39.570871 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.026253 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.140327 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.239985 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-scripts\") pod \"311482dc-62f2-47f5-b1ea-005877d89e83\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.240718 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-combined-ca-bundle\") pod \"311482dc-62f2-47f5-b1ea-005877d89e83\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.240784 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q9hp\" (UniqueName: \"kubernetes.io/projected/311482dc-62f2-47f5-b1ea-005877d89e83-kube-api-access-9q9hp\") pod \"311482dc-62f2-47f5-b1ea-005877d89e83\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.240816 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-config-data\") pod \"311482dc-62f2-47f5-b1ea-005877d89e83\" (UID: \"311482dc-62f2-47f5-b1ea-005877d89e83\") " Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.244319 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-scripts" (OuterVolumeSpecName: "scripts") pod "311482dc-62f2-47f5-b1ea-005877d89e83" (UID: "311482dc-62f2-47f5-b1ea-005877d89e83"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.244501 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311482dc-62f2-47f5-b1ea-005877d89e83-kube-api-access-9q9hp" (OuterVolumeSpecName: "kube-api-access-9q9hp") pod "311482dc-62f2-47f5-b1ea-005877d89e83" (UID: "311482dc-62f2-47f5-b1ea-005877d89e83"). InnerVolumeSpecName "kube-api-access-9q9hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.259631 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "311482dc-62f2-47f5-b1ea-005877d89e83" (UID: "311482dc-62f2-47f5-b1ea-005877d89e83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.259956 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-config-data" (OuterVolumeSpecName: "config-data") pod "311482dc-62f2-47f5-b1ea-005877d89e83" (UID: "311482dc-62f2-47f5-b1ea-005877d89e83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.342983 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.343203 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.343215 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q9hp\" (UniqueName: \"kubernetes.io/projected/311482dc-62f2-47f5-b1ea-005877d89e83-kube-api-access-9q9hp\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:40 crc kubenswrapper[4786]: I1002 07:02:40.343224 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/311482dc-62f2-47f5-b1ea-005877d89e83-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.005823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxt7n" event={"ID":"c108f77a-18c2-4814-a4f8-edfcc456e1ae","Type":"ContainerStarted","Data":"515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86"} Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.008500 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rgckl" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.008489 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rgckl" event={"ID":"311482dc-62f2-47f5-b1ea-005877d89e83","Type":"ContainerDied","Data":"2405aba93522609d04f478059fb34d3ce0629f13d68c64409c67e7996231b5b0"} Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.008611 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2405aba93522609d04f478059fb34d3ce0629f13d68c64409c67e7996231b5b0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.010496 4786 generic.go:334] "Generic (PLEG): container finished" podID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerID="3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884" exitCode=0 Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.010550 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlr9" event={"ID":"033778c0-6165-45d9-b4fc-87d0cadd03d1","Type":"ContainerDied","Data":"3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884"} Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.024988 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxt7n" podStartSLOduration=2.329167311 podStartE2EDuration="7.024977285s" podCreationTimestamp="2025-10-02 07:02:34 +0000 UTC" firstStartedPulling="2025-10-02 07:02:35.94525899 +0000 UTC m=+966.066442121" lastFinishedPulling="2025-10-02 07:02:40.641068964 +0000 UTC m=+970.762252095" observedRunningTime="2025-10-02 07:02:41.018532858 +0000 UTC m=+971.139716000" watchObservedRunningTime="2025-10-02 07:02:41.024977285 +0000 UTC m=+971.146160417" Oct 02 07:02:41 crc kubenswrapper[4786]: E1002 07:02:41.127843 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311482dc_62f2_47f5_b1ea_005877d89e83.slice/crio-2405aba93522609d04f478059fb34d3ce0629f13d68c64409c67e7996231b5b0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod311482dc_62f2_47f5_b1ea_005877d89e83.slice\": RecentStats: unable to find data in memory cache]" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.203265 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 07:02:41 crc kubenswrapper[4786]: E1002 07:02:41.203618 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311482dc-62f2-47f5-b1ea-005877d89e83" containerName="nova-cell0-conductor-db-sync" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.203634 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="311482dc-62f2-47f5-b1ea-005877d89e83" containerName="nova-cell0-conductor-db-sync" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.203804 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="311482dc-62f2-47f5-b1ea-005877d89e83" containerName="nova-cell0-conductor-db-sync" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.204296 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.210838 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.211898 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.212169 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-gbm4v" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.257254 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.257475 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.257746 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jx5j\" (UniqueName: \"kubernetes.io/projected/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-kube-api-access-5jx5j\") pod \"nova-cell0-conductor-0\" (UID: \"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.358506 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jx5j\" (UniqueName: 
\"kubernetes.io/projected/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-kube-api-access-5jx5j\") pod \"nova-cell0-conductor-0\" (UID: \"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.358564 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.358647 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.363888 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.363914 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.371014 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jx5j\" (UniqueName: \"kubernetes.io/projected/6ca74ccc-e756-4687-ad8e-7ffecd4b92f7-kube-api-access-5jx5j\") pod \"nova-cell0-conductor-0\" (UID: 
\"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7\") " pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.520776 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:41 crc kubenswrapper[4786]: I1002 07:02:41.893293 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 07:02:41 crc kubenswrapper[4786]: W1002 07:02:41.902653 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ca74ccc_e756_4687_ad8e_7ffecd4b92f7.slice/crio-024db4a4f61f7c1b99105a275d3a328bb7dcd26779a8dec671e7b09c7dd199b2 WatchSource:0}: Error finding container 024db4a4f61f7c1b99105a275d3a328bb7dcd26779a8dec671e7b09c7dd199b2: Status 404 returned error can't find the container with id 024db4a4f61f7c1b99105a275d3a328bb7dcd26779a8dec671e7b09c7dd199b2 Oct 02 07:02:42 crc kubenswrapper[4786]: I1002 07:02:42.019267 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlr9" event={"ID":"033778c0-6165-45d9-b4fc-87d0cadd03d1","Type":"ContainerStarted","Data":"dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036"} Oct 02 07:02:42 crc kubenswrapper[4786]: I1002 07:02:42.021369 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7","Type":"ContainerStarted","Data":"024db4a4f61f7c1b99105a275d3a328bb7dcd26779a8dec671e7b09c7dd199b2"} Oct 02 07:02:42 crc kubenswrapper[4786]: I1002 07:02:42.021486 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:42 crc kubenswrapper[4786]: I1002 07:02:42.034333 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sdlr9" podStartSLOduration=3.13102922 
podStartE2EDuration="9.034321482s" podCreationTimestamp="2025-10-02 07:02:33 +0000 UTC" firstStartedPulling="2025-10-02 07:02:35.94245786 +0000 UTC m=+966.063640991" lastFinishedPulling="2025-10-02 07:02:41.845750122 +0000 UTC m=+971.966933253" observedRunningTime="2025-10-02 07:02:42.03053233 +0000 UTC m=+972.151715471" watchObservedRunningTime="2025-10-02 07:02:42.034321482 +0000 UTC m=+972.155504613" Oct 02 07:02:42 crc kubenswrapper[4786]: I1002 07:02:42.050302 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.050288608 podStartE2EDuration="1.050288608s" podCreationTimestamp="2025-10-02 07:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:42.047202279 +0000 UTC m=+972.168385411" watchObservedRunningTime="2025-10-02 07:02:42.050288608 +0000 UTC m=+972.171471739" Oct 02 07:02:43 crc kubenswrapper[4786]: I1002 07:02:43.028185 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6ca74ccc-e756-4687-ad8e-7ffecd4b92f7","Type":"ContainerStarted","Data":"965e3fdf097fb1915894ce8131f0e4e6c4d86f9e2d1eac5f3d88ab02fc8e1f6a"} Oct 02 07:02:43 crc kubenswrapper[4786]: I1002 07:02:43.784442 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg47"] Oct 02 07:02:43 crc kubenswrapper[4786]: I1002 07:02:43.784649 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6qg47" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerName="registry-server" containerID="cri-o://3a3529051142c5b504c69b335b97ebfb45f2aaa53c2ad825f708add66d2e8a86" gracePeriod=2 Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.042475 4786 generic.go:334] "Generic (PLEG): container finished" podID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" 
containerID="3a3529051142c5b504c69b335b97ebfb45f2aaa53c2ad825f708add66d2e8a86" exitCode=0 Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.042566 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg47" event={"ID":"f5082899-e4d4-4228-8cb6-65f5a97c2bc7","Type":"ContainerDied","Data":"3a3529051142c5b504c69b335b97ebfb45f2aaa53c2ad825f708add66d2e8a86"} Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.141487 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.210847 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-catalog-content\") pod \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.211000 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-utilities\") pod \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.211043 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gdc7\" (UniqueName: \"kubernetes.io/projected/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-kube-api-access-6gdc7\") pod \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\" (UID: \"f5082899-e4d4-4228-8cb6-65f5a97c2bc7\") " Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.211347 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-utilities" (OuterVolumeSpecName: "utilities") pod "f5082899-e4d4-4228-8cb6-65f5a97c2bc7" (UID: 
"f5082899-e4d4-4228-8cb6-65f5a97c2bc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.211631 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.215627 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-kube-api-access-6gdc7" (OuterVolumeSpecName: "kube-api-access-6gdc7") pod "f5082899-e4d4-4228-8cb6-65f5a97c2bc7" (UID: "f5082899-e4d4-4228-8cb6-65f5a97c2bc7"). InnerVolumeSpecName "kube-api-access-6gdc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.219039 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5082899-e4d4-4228-8cb6-65f5a97c2bc7" (UID: "f5082899-e4d4-4228-8cb6-65f5a97c2bc7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.312790 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.312857 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.312880 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gdc7\" (UniqueName: \"kubernetes.io/projected/f5082899-e4d4-4228-8cb6-65f5a97c2bc7-kube-api-access-6gdc7\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.313136 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.347436 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.522232 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:44 crc kubenswrapper[4786]: I1002 07:02:44.522271 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:45 crc kubenswrapper[4786]: I1002 07:02:45.049992 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6qg47" event={"ID":"f5082899-e4d4-4228-8cb6-65f5a97c2bc7","Type":"ContainerDied","Data":"caf8ed72646825703fb78e9a5940bb7847d5e24871ac1a3e9ea1002cc03c51c8"} Oct 02 07:02:45 crc kubenswrapper[4786]: I1002 07:02:45.050049 4786 scope.go:117] "RemoveContainer" 
containerID="3a3529051142c5b504c69b335b97ebfb45f2aaa53c2ad825f708add66d2e8a86" Oct 02 07:02:45 crc kubenswrapper[4786]: I1002 07:02:45.050012 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6qg47" Oct 02 07:02:45 crc kubenswrapper[4786]: I1002 07:02:45.064713 4786 scope.go:117] "RemoveContainer" containerID="bee5eb57b4a597a7a3734e4993c0d875275ec6c8abf244fb17a310507fc6a301" Oct 02 07:02:45 crc kubenswrapper[4786]: I1002 07:02:45.072963 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg47"] Oct 02 07:02:45 crc kubenswrapper[4786]: I1002 07:02:45.077682 4786 scope.go:117] "RemoveContainer" containerID="004eebaef020a44a5c2465bcf70a8ce5b7043913ca3d70372b22019431b9f805" Oct 02 07:02:45 crc kubenswrapper[4786]: I1002 07:02:45.079420 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6qg47"] Oct 02 07:02:45 crc kubenswrapper[4786]: I1002 07:02:45.552130 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mxt7n" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="registry-server" probeResult="failure" output=< Oct 02 07:02:45 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Oct 02 07:02:45 crc kubenswrapper[4786]: > Oct 02 07:02:46 crc kubenswrapper[4786]: I1002 07:02:46.088750 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:02:46 crc kubenswrapper[4786]: I1002 07:02:46.186653 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" path="/var/lib/kubelet/pods/f5082899-e4d4-4228-8cb6-65f5a97c2bc7/volumes" Oct 02 07:02:47 crc kubenswrapper[4786]: I1002 07:02:47.599931 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-sdlr9"] Oct 02 07:02:47 crc kubenswrapper[4786]: I1002 07:02:47.985498 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bg46m"] Oct 02 07:02:47 crc kubenswrapper[4786]: I1002 07:02:47.985960 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bg46m" podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerName="registry-server" containerID="cri-o://0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c" gracePeriod=2 Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.073623 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.145:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.073671 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="c3dc169b-9901-4a6f-85dc-8c82c43ac1c9" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.145:9292/healthcheck\": dial tcp 10.217.0.145:9292: i/o timeout" Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.347001 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bg46m" Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.468392 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-catalog-content\") pod \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.468454 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-utilities\") pod \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.468583 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2j6r\" (UniqueName: \"kubernetes.io/projected/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-kube-api-access-x2j6r\") pod \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\" (UID: \"1e115b01-78ba-4811-a0e6-24cca7bbb0f7\") " Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.468939 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-utilities" (OuterVolumeSpecName: "utilities") pod "1e115b01-78ba-4811-a0e6-24cca7bbb0f7" (UID: "1e115b01-78ba-4811-a0e6-24cca7bbb0f7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.469137 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.472959 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-kube-api-access-x2j6r" (OuterVolumeSpecName: "kube-api-access-x2j6r") pod "1e115b01-78ba-4811-a0e6-24cca7bbb0f7" (UID: "1e115b01-78ba-4811-a0e6-24cca7bbb0f7"). InnerVolumeSpecName "kube-api-access-x2j6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.499082 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e115b01-78ba-4811-a0e6-24cca7bbb0f7" (UID: "1e115b01-78ba-4811-a0e6-24cca7bbb0f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.570167 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2j6r\" (UniqueName: \"kubernetes.io/projected/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-kube-api-access-x2j6r\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:48 crc kubenswrapper[4786]: I1002 07:02:48.570320 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e115b01-78ba-4811-a0e6-24cca7bbb0f7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.077602 4786 generic.go:334] "Generic (PLEG): container finished" podID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerID="0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c" exitCode=0 Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.077652 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bg46m" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.077667 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg46m" event={"ID":"1e115b01-78ba-4811-a0e6-24cca7bbb0f7","Type":"ContainerDied","Data":"0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c"} Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.077971 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bg46m" event={"ID":"1e115b01-78ba-4811-a0e6-24cca7bbb0f7","Type":"ContainerDied","Data":"9a0534a9504e1d00ba4623c58bd5241b8005705771555410eff011db3e15a5b8"} Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.077988 4786 scope.go:117] "RemoveContainer" containerID="0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.097822 4786 scope.go:117] "RemoveContainer" 
containerID="ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.100986 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bg46m"] Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.106641 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bg46m"] Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.131286 4786 scope.go:117] "RemoveContainer" containerID="e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.147936 4786 scope.go:117] "RemoveContainer" containerID="0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c" Oct 02 07:02:49 crc kubenswrapper[4786]: E1002 07:02:49.148299 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c\": container with ID starting with 0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c not found: ID does not exist" containerID="0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.148337 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c"} err="failed to get container status \"0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c\": rpc error: code = NotFound desc = could not find container \"0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c\": container with ID starting with 0910d314d37a92c4d3965f421d8026d5f4d02a3c89671659e8a42d415d2c742c not found: ID does not exist" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.148360 4786 scope.go:117] "RemoveContainer" 
containerID="ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a" Oct 02 07:02:49 crc kubenswrapper[4786]: E1002 07:02:49.148626 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a\": container with ID starting with ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a not found: ID does not exist" containerID="ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.148656 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a"} err="failed to get container status \"ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a\": rpc error: code = NotFound desc = could not find container \"ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a\": container with ID starting with ff15fe0eb64b2bdb59c434e895245793e0958257584defb9a61bfe74e33b937a not found: ID does not exist" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.148698 4786 scope.go:117] "RemoveContainer" containerID="e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6" Oct 02 07:02:49 crc kubenswrapper[4786]: E1002 07:02:49.148982 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6\": container with ID starting with e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6 not found: ID does not exist" containerID="e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6" Oct 02 07:02:49 crc kubenswrapper[4786]: I1002 07:02:49.149005 4786 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6"} err="failed to get container status \"e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6\": rpc error: code = NotFound desc = could not find container \"e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6\": container with ID starting with e9350ab49338cacb7a0e6868f0a628a3e67eb906fe1ddb356eb02f4c98c111b6 not found: ID does not exist" Oct 02 07:02:50 crc kubenswrapper[4786]: I1002 07:02:50.186517 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" path="/var/lib/kubelet/pods/1e115b01-78ba-4811-a0e6-24cca7bbb0f7/volumes" Oct 02 07:02:51 crc kubenswrapper[4786]: I1002 07:02:51.540927 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.004925 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-nd9rf"] Oct 02 07:02:52 crc kubenswrapper[4786]: E1002 07:02:52.005464 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerName="extract-content" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.005481 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerName="extract-content" Oct 02 07:02:52 crc kubenswrapper[4786]: E1002 07:02:52.005495 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerName="extract-utilities" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.005501 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerName="extract-utilities" Oct 02 07:02:52 crc kubenswrapper[4786]: E1002 07:02:52.005512 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerName="registry-server" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.005518 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerName="registry-server" Oct 02 07:02:52 crc kubenswrapper[4786]: E1002 07:02:52.005527 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerName="extract-content" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.005531 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerName="extract-content" Oct 02 07:02:52 crc kubenswrapper[4786]: E1002 07:02:52.005544 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerName="registry-server" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.005549 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerName="registry-server" Oct 02 07:02:52 crc kubenswrapper[4786]: E1002 07:02:52.005558 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerName="extract-utilities" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.005563 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerName="extract-utilities" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.005948 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5082899-e4d4-4228-8cb6-65f5a97c2bc7" containerName="registry-server" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.005975 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e115b01-78ba-4811-a0e6-24cca7bbb0f7" containerName="registry-server" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.006417 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.009031 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.009301 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.013684 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nd9rf"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.116502 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-config-data\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.116819 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrld\" (UniqueName: \"kubernetes.io/projected/2af71207-973b-401b-bba1-e78a23a043b5-kube-api-access-7xrld\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.116890 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-scripts\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.116920 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.121881 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.123120 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.124597 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.130248 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.200108 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.201322 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.205418 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.215976 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.218057 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-config-data\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.218114 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnsj5\" (UniqueName: \"kubernetes.io/projected/55fec9de-b3a5-4a32-9572-c49018b331f7-kube-api-access-gnsj5\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.218179 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fec9de-b3a5-4a32-9572-c49018b331f7-logs\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.218207 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-config-data\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.218268 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7xrld\" (UniqueName: \"kubernetes.io/projected/2af71207-973b-401b-bba1-e78a23a043b5-kube-api-access-7xrld\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.218306 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.218323 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-scripts\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.218344 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.225257 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.227979 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-config-data\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.232878 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-scripts\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.251111 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.252063 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.258848 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xrld\" (UniqueName: \"kubernetes.io/projected/2af71207-973b-401b-bba1-e78a23a043b5-kube-api-access-7xrld\") pod \"nova-cell0-cell-mapping-nd9rf\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.259012 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.278724 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.320639 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.320853 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnsj5\" (UniqueName: \"kubernetes.io/projected/55fec9de-b3a5-4a32-9572-c49018b331f7-kube-api-access-gnsj5\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.320891 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-config-data\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.320934 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms7xw\" (UniqueName: \"kubernetes.io/projected/b630beef-e3c4-4a29-bbc4-d43349b02284-kube-api-access-ms7xw\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.320959 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.320977 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fec9de-b3a5-4a32-9572-c49018b331f7-logs\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 
07:02:52.320999 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-config-data\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.321078 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.321107 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.321127 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b630beef-e3c4-4a29-bbc4-d43349b02284-logs\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.321258 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8lf5\" (UniqueName: \"kubernetes.io/projected/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-kube-api-access-f8lf5\") pod \"nova-cell1-novncproxy-0\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.321845 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/55fec9de-b3a5-4a32-9572-c49018b331f7-logs\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.322195 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d65b9d95f-ktpgz"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.323425 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.324431 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.326127 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.326143 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-config-data\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.346457 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.347237 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnsj5\" (UniqueName: \"kubernetes.io/projected/55fec9de-b3a5-4a32-9572-c49018b331f7-kube-api-access-gnsj5\") pod \"nova-api-0\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") " pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.347541 4786 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.352280 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.358418 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d65b9d95f-ktpgz"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.368121 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422457 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms7xw\" (UniqueName: \"kubernetes.io/projected/b630beef-e3c4-4a29-bbc4-d43349b02284-kube-api-access-ms7xw\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422505 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-config-data\") pod \"nova-scheduler-0\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422601 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph7tm\" (UniqueName: \"kubernetes.io/projected/8d776183-8989-4482-8810-cf37b4a85669-kube-api-access-ph7tm\") pod \"nova-scheduler-0\" (UID: 
\"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422635 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nc5d\" (UniqueName: \"kubernetes.io/projected/a94b58fb-29b4-4b36-9229-748bb7705f42-kube-api-access-7nc5d\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422660 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-swift-storage-0\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422709 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422733 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422755 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-sb\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: 
\"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422787 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b630beef-e3c4-4a29-bbc4-d43349b02284-logs\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422808 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-config\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422867 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8lf5\" (UniqueName: \"kubernetes.io/projected/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-kube-api-access-f8lf5\") pod \"nova-cell1-novncproxy-0\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422891 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-svc\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.422954 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 
02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.423017 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-nb\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.423048 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-config-data\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.430413 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b630beef-e3c4-4a29-bbc4-d43349b02284-logs\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.431187 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.432168 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-config-data\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.432278 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.433215 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.450132 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.464098 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms7xw\" (UniqueName: \"kubernetes.io/projected/b630beef-e3c4-4a29-bbc4-d43349b02284-kube-api-access-ms7xw\") pod \"nova-metadata-0\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") " pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.469059 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8lf5\" (UniqueName: \"kubernetes.io/projected/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-kube-api-access-f8lf5\") pod \"nova-cell1-novncproxy-0\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.516045 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.526411 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph7tm\" (UniqueName: \"kubernetes.io/projected/8d776183-8989-4482-8810-cf37b4a85669-kube-api-access-ph7tm\") pod \"nova-scheduler-0\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.526799 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nc5d\" (UniqueName: \"kubernetes.io/projected/a94b58fb-29b4-4b36-9229-748bb7705f42-kube-api-access-7nc5d\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.526893 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-swift-storage-0\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.526979 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.527044 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-sb\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" 
Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.527104 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-config\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.527207 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-svc\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.527303 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-nb\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.527392 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-config-data\") pod \"nova-scheduler-0\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.531384 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-sb\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.532118 4786 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-config\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.532153 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-swift-storage-0\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.532770 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-svc\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.533281 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-nb\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.543189 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.544480 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-config-data\") pod 
\"nova-scheduler-0\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.546074 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph7tm\" (UniqueName: \"kubernetes.io/projected/8d776183-8989-4482-8810-cf37b4a85669-kube-api-access-ph7tm\") pod \"nova-scheduler-0\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") " pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.550229 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.552145 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nc5d\" (UniqueName: \"kubernetes.io/projected/a94b58fb-29b4-4b36-9229-748bb7705f42-kube-api-access-7nc5d\") pod \"dnsmasq-dns-5d65b9d95f-ktpgz\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.600422 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.721027 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.755664 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.938232 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-crkmc"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.942586 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:52 crc kubenswrapper[4786]: W1002 07:02:52.943187 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55fec9de_b3a5_4a32_9572_c49018b331f7.slice/crio-3af4bf2c3c46a31ce8804ff9b763de66743d932fa364eea656daf3e7a4e7cb67 WatchSource:0}: Error finding container 3af4bf2c3c46a31ce8804ff9b763de66743d932fa364eea656daf3e7a4e7cb67: Status 404 returned error can't find the container with id 3af4bf2c3c46a31ce8804ff9b763de66743d932fa364eea656daf3e7a4e7cb67 Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.944095 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.950442 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.955183 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-crkmc"] Oct 02 07:02:52 crc kubenswrapper[4786]: I1002 07:02:52.966322 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:02:53 crc kubenswrapper[4786]: W1002 07:02:53.026564 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af71207_973b_401b_bba1_e78a23a043b5.slice/crio-b034663c31a1c52c9099e0b8dbbb27f689acebb941d3fb13d1358606b09796be WatchSource:0}: Error finding container b034663c31a1c52c9099e0b8dbbb27f689acebb941d3fb13d1358606b09796be: Status 404 returned error can't find the container with id b034663c31a1c52c9099e0b8dbbb27f689acebb941d3fb13d1358606b09796be Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.027346 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-nd9rf"] Oct 02 
07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.034280 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.043097 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ckwf\" (UniqueName: \"kubernetes.io/projected/478c626d-63fa-4342-b4b7-59d09c6ce3c1-kube-api-access-4ckwf\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.043177 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.043196 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-config-data\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.043217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-scripts\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.108519 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b630beef-e3c4-4a29-bbc4-d43349b02284","Type":"ContainerStarted","Data":"25eb2b8d346be3cc1913f4f9e1727b33575fdb3922ed64dcc52aeae292a42197"} Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.109391 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nd9rf" event={"ID":"2af71207-973b-401b-bba1-e78a23a043b5","Type":"ContainerStarted","Data":"b034663c31a1c52c9099e0b8dbbb27f689acebb941d3fb13d1358606b09796be"} Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.110581 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55fec9de-b3a5-4a32-9572-c49018b331f7","Type":"ContainerStarted","Data":"3af4bf2c3c46a31ce8804ff9b763de66743d932fa364eea656daf3e7a4e7cb67"} Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.134884 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:02:53 crc kubenswrapper[4786]: W1002 07:02:53.139781 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda80e1e53_545c_4e0c_a9f1_07cb2f3b7f24.slice/crio-575c71367a80cd7e6e82ec07a5c1ec09f34adcb17942c17ba7ee3aa04df6f943 WatchSource:0}: Error finding container 575c71367a80cd7e6e82ec07a5c1ec09f34adcb17942c17ba7ee3aa04df6f943: Status 404 returned error can't find the container with id 575c71367a80cd7e6e82ec07a5c1ec09f34adcb17942c17ba7ee3aa04df6f943 Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.144472 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ckwf\" (UniqueName: \"kubernetes.io/projected/478c626d-63fa-4342-b4b7-59d09c6ce3c1-kube-api-access-4ckwf\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.144609 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.144634 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-config-data\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.144655 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-scripts\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.147608 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d65b9d95f-ktpgz"] Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.148478 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.149147 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-config-data\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " 
pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.151290 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-scripts\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: W1002 07:02:53.153398 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda94b58fb_29b4_4b36_9229_748bb7705f42.slice/crio-f5c329dda923382eafe51e1416084e6e170b5921e0755be4ae2332ee3f05819d WatchSource:0}: Error finding container f5c329dda923382eafe51e1416084e6e170b5921e0755be4ae2332ee3f05819d: Status 404 returned error can't find the container with id f5c329dda923382eafe51e1416084e6e170b5921e0755be4ae2332ee3f05819d Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.159627 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ckwf\" (UniqueName: \"kubernetes.io/projected/478c626d-63fa-4342-b4b7-59d09c6ce3c1-kube-api-access-4ckwf\") pod \"nova-cell1-conductor-db-sync-crkmc\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.261883 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.271623 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:53 crc kubenswrapper[4786]: I1002 07:02:53.651602 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-crkmc"] Oct 02 07:02:53 crc kubenswrapper[4786]: W1002 07:02:53.654390 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod478c626d_63fa_4342_b4b7_59d09c6ce3c1.slice/crio-1f89e6c035ed62a1f6d80650bc88843b67dea4494913d53f760305c661420b6c WatchSource:0}: Error finding container 1f89e6c035ed62a1f6d80650bc88843b67dea4494913d53f760305c661420b6c: Status 404 returned error can't find the container with id 1f89e6c035ed62a1f6d80650bc88843b67dea4494913d53f760305c661420b6c Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.118037 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nd9rf" event={"ID":"2af71207-973b-401b-bba1-e78a23a043b5","Type":"ContainerStarted","Data":"6f1063672d6933be464f6d4561355dd1715d5dbefa2a1f18e31820d72bf3cfdd"} Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.120128 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24","Type":"ContainerStarted","Data":"575c71367a80cd7e6e82ec07a5c1ec09f34adcb17942c17ba7ee3aa04df6f943"} Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.122306 4786 generic.go:334] "Generic (PLEG): container finished" podID="a94b58fb-29b4-4b36-9229-748bb7705f42" containerID="f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2" exitCode=0 Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.122367 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" event={"ID":"a94b58fb-29b4-4b36-9229-748bb7705f42","Type":"ContainerDied","Data":"f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2"} Oct 02 07:02:54 
crc kubenswrapper[4786]: I1002 07:02:54.122384 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" event={"ID":"a94b58fb-29b4-4b36-9229-748bb7705f42","Type":"ContainerStarted","Data":"f5c329dda923382eafe51e1416084e6e170b5921e0755be4ae2332ee3f05819d"} Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.125604 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-crkmc" event={"ID":"478c626d-63fa-4342-b4b7-59d09c6ce3c1","Type":"ContainerStarted","Data":"fd599f2df437a1bffba5b17e5812f2b05418d1811e5559c815e1a2ec3193895a"} Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.125644 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-crkmc" event={"ID":"478c626d-63fa-4342-b4b7-59d09c6ce3c1","Type":"ContainerStarted","Data":"1f89e6c035ed62a1f6d80650bc88843b67dea4494913d53f760305c661420b6c"} Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.127618 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8d776183-8989-4482-8810-cf37b4a85669","Type":"ContainerStarted","Data":"cecd40da4917e1bc99e301bdb8e8dcbae96e31169a9af9d7da1a38ffcebf0d18"} Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.135172 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-nd9rf" podStartSLOduration=3.135156368 podStartE2EDuration="3.135156368s" podCreationTimestamp="2025-10-02 07:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:54.13217044 +0000 UTC m=+984.253353572" watchObservedRunningTime="2025-10-02 07:02:54.135156368 +0000 UTC m=+984.256339499" Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.145122 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-crkmc" 
podStartSLOduration=2.145108268 podStartE2EDuration="2.145108268s" podCreationTimestamp="2025-10-02 07:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:54.144203342 +0000 UTC m=+984.265386483" watchObservedRunningTime="2025-10-02 07:02:54.145108268 +0000 UTC m=+984.266291399" Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.582730 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:54 crc kubenswrapper[4786]: I1002 07:02:54.632556 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:55 crc kubenswrapper[4786]: I1002 07:02:55.143610 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" event={"ID":"a94b58fb-29b4-4b36-9229-748bb7705f42","Type":"ContainerStarted","Data":"bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144"} Oct 02 07:02:55 crc kubenswrapper[4786]: I1002 07:02:55.145380 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:02:55 crc kubenswrapper[4786]: I1002 07:02:55.169074 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" podStartSLOduration=3.16905651 podStartE2EDuration="3.16905651s" podCreationTimestamp="2025-10-02 07:02:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:02:55.164280558 +0000 UTC m=+985.285463699" watchObservedRunningTime="2025-10-02 07:02:55.16905651 +0000 UTC m=+985.290239641" Oct 02 07:02:55 crc kubenswrapper[4786]: I1002 07:02:55.556639 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:02:55 crc 
kubenswrapper[4786]: I1002 07:02:55.566684 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:02:55 crc kubenswrapper[4786]: I1002 07:02:55.816819 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxt7n"] Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.151021 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxt7n" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="registry-server" containerID="cri-o://515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86" gracePeriod=2 Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.538725 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.622192 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4fbq\" (UniqueName: \"kubernetes.io/projected/c108f77a-18c2-4814-a4f8-edfcc456e1ae-kube-api-access-t4fbq\") pod \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.622511 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-catalog-content\") pod \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.622560 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-utilities\") pod \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\" (UID: \"c108f77a-18c2-4814-a4f8-edfcc456e1ae\") " Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 
07:02:56.623168 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-utilities" (OuterVolumeSpecName: "utilities") pod "c108f77a-18c2-4814-a4f8-edfcc456e1ae" (UID: "c108f77a-18c2-4814-a4f8-edfcc456e1ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.626934 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c108f77a-18c2-4814-a4f8-edfcc456e1ae-kube-api-access-t4fbq" (OuterVolumeSpecName: "kube-api-access-t4fbq") pod "c108f77a-18c2-4814-a4f8-edfcc456e1ae" (UID: "c108f77a-18c2-4814-a4f8-edfcc456e1ae"). InnerVolumeSpecName "kube-api-access-t4fbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.670727 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c108f77a-18c2-4814-a4f8-edfcc456e1ae" (UID: "c108f77a-18c2-4814-a4f8-edfcc456e1ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.725395 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.725428 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c108f77a-18c2-4814-a4f8-edfcc456e1ae-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:56 crc kubenswrapper[4786]: I1002 07:02:56.725439 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4fbq\" (UniqueName: \"kubernetes.io/projected/c108f77a-18c2-4814-a4f8-edfcc456e1ae-kube-api-access-t4fbq\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.162199 4786 generic.go:334] "Generic (PLEG): container finished" podID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerID="515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86" exitCode=0 Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.162250 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxt7n" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.162292 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxt7n" event={"ID":"c108f77a-18c2-4814-a4f8-edfcc456e1ae","Type":"ContainerDied","Data":"515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86"} Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.162345 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxt7n" event={"ID":"c108f77a-18c2-4814-a4f8-edfcc456e1ae","Type":"ContainerDied","Data":"784a06f7da0bd51f761df61cb6cfddd7ee67c82ddbd786563cf5ef7f5ae4b434"} Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.162363 4786 scope.go:117] "RemoveContainer" containerID="515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.163927 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8d776183-8989-4482-8810-cf37b4a85669","Type":"ContainerStarted","Data":"09a81c234a0a01beaa914d4bb4690513c15b70508dfb354ade4c1265f9a82d49"} Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.165871 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24","Type":"ContainerStarted","Data":"b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0"} Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.165953 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0" gracePeriod=30 Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.169438 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-api-0" event={"ID":"55fec9de-b3a5-4a32-9572-c49018b331f7","Type":"ContainerStarted","Data":"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7"} Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.169471 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55fec9de-b3a5-4a32-9572-c49018b331f7","Type":"ContainerStarted","Data":"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7"} Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.179833 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.317760066 podStartE2EDuration="5.179823255s" podCreationTimestamp="2025-10-02 07:02:52 +0000 UTC" firstStartedPulling="2025-10-02 07:02:53.264770281 +0000 UTC m=+983.385953412" lastFinishedPulling="2025-10-02 07:02:56.126833469 +0000 UTC m=+986.248016601" observedRunningTime="2025-10-02 07:02:57.176888684 +0000 UTC m=+987.298071814" watchObservedRunningTime="2025-10-02 07:02:57.179823255 +0000 UTC m=+987.301006386" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.187412 4786 scope.go:117] "RemoveContainer" containerID="1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.195389 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.224974476 podStartE2EDuration="5.195366915s" podCreationTimestamp="2025-10-02 07:02:52 +0000 UTC" firstStartedPulling="2025-10-02 07:02:53.146380085 +0000 UTC m=+983.267563216" lastFinishedPulling="2025-10-02 07:02:56.116772525 +0000 UTC m=+986.237955655" observedRunningTime="2025-10-02 07:02:57.189830859 +0000 UTC m=+987.311014010" watchObservedRunningTime="2025-10-02 07:02:57.195366915 +0000 UTC m=+987.316550045" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.229823 4786 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.06794765 podStartE2EDuration="5.229806434s" podCreationTimestamp="2025-10-02 07:02:52 +0000 UTC" firstStartedPulling="2025-10-02 07:02:52.951498042 +0000 UTC m=+983.072681173" lastFinishedPulling="2025-10-02 07:02:56.113356825 +0000 UTC m=+986.234539957" observedRunningTime="2025-10-02 07:02:57.208917868 +0000 UTC m=+987.330101009" watchObservedRunningTime="2025-10-02 07:02:57.229806434 +0000 UTC m=+987.350989564" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.234330 4786 scope.go:117] "RemoveContainer" containerID="f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.234433 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxt7n"] Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.240199 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxt7n"] Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.257869 4786 scope.go:117] "RemoveContainer" containerID="515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86" Oct 02 07:02:57 crc kubenswrapper[4786]: E1002 07:02:57.258948 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86\": container with ID starting with 515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86 not found: ID does not exist" containerID="515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.259005 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86"} err="failed to get container status 
\"515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86\": rpc error: code = NotFound desc = could not find container \"515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86\": container with ID starting with 515db1b46e721b5ea1e42925d7dcfe483914e773d536d9dcb3459de963068b86 not found: ID does not exist" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.259042 4786 scope.go:117] "RemoveContainer" containerID="1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3" Oct 02 07:02:57 crc kubenswrapper[4786]: E1002 07:02:57.259464 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3\": container with ID starting with 1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3 not found: ID does not exist" containerID="1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.259509 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3"} err="failed to get container status \"1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3\": rpc error: code = NotFound desc = could not find container \"1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3\": container with ID starting with 1ed1846ddf260957b11c4239a5d71c2a2743e9ddbd2945c12a28de2960d019e3 not found: ID does not exist" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.259547 4786 scope.go:117] "RemoveContainer" containerID="f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9" Oct 02 07:02:57 crc kubenswrapper[4786]: E1002 07:02:57.260758 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9\": container with ID starting with f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9 not found: ID does not exist" containerID="f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.260793 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9"} err="failed to get container status \"f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9\": rpc error: code = NotFound desc = could not find container \"f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9\": container with ID starting with f323c27a27b58bf53d356f3ddedbc4fa894a4ab73e0f61816f1242114cc3cdb9 not found: ID does not exist" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.601031 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:02:57 crc kubenswrapper[4786]: I1002 07:02:57.757598 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.180991 4786 generic.go:334] "Generic (PLEG): container finished" podID="478c626d-63fa-4342-b4b7-59d09c6ce3c1" containerID="fd599f2df437a1bffba5b17e5812f2b05418d1811e5559c815e1a2ec3193895a" exitCode=0 Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.187810 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" path="/var/lib/kubelet/pods/c108f77a-18c2-4814-a4f8-edfcc456e1ae/volumes" Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.188621 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-crkmc" 
event={"ID":"478c626d-63fa-4342-b4b7-59d09c6ce3c1","Type":"ContainerDied","Data":"fd599f2df437a1bffba5b17e5812f2b05418d1811e5559c815e1a2ec3193895a"} Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.219034 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.219230 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="565ac7e2-35f2-4085-96f7-d6f78e14a4e2" containerName="kube-state-metrics" containerID="cri-o://3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08" gracePeriod=30 Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.617735 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.661763 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvql2\" (UniqueName: \"kubernetes.io/projected/565ac7e2-35f2-4085-96f7-d6f78e14a4e2-kube-api-access-tvql2\") pod \"565ac7e2-35f2-4085-96f7-d6f78e14a4e2\" (UID: \"565ac7e2-35f2-4085-96f7-d6f78e14a4e2\") " Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.672874 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565ac7e2-35f2-4085-96f7-d6f78e14a4e2-kube-api-access-tvql2" (OuterVolumeSpecName: "kube-api-access-tvql2") pod "565ac7e2-35f2-4085-96f7-d6f78e14a4e2" (UID: "565ac7e2-35f2-4085-96f7-d6f78e14a4e2"). InnerVolumeSpecName "kube-api-access-tvql2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:58 crc kubenswrapper[4786]: I1002 07:02:58.764135 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvql2\" (UniqueName: \"kubernetes.io/projected/565ac7e2-35f2-4085-96f7-d6f78e14a4e2-kube-api-access-tvql2\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.192507 4786 generic.go:334] "Generic (PLEG): container finished" podID="565ac7e2-35f2-4085-96f7-d6f78e14a4e2" containerID="3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08" exitCode=2 Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.192633 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.193124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"565ac7e2-35f2-4085-96f7-d6f78e14a4e2","Type":"ContainerDied","Data":"3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08"} Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.193151 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"565ac7e2-35f2-4085-96f7-d6f78e14a4e2","Type":"ContainerDied","Data":"b49d156d3571119acc3a0afaa2121ba79dcdb836c12d5aec7c4063e04cf7ce7e"} Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.193170 4786 scope.go:117] "RemoveContainer" containerID="3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.194951 4786 generic.go:334] "Generic (PLEG): container finished" podID="2af71207-973b-401b-bba1-e78a23a043b5" containerID="6f1063672d6933be464f6d4561355dd1715d5dbefa2a1f18e31820d72bf3cfdd" exitCode=0 Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.195124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nd9rf" 
event={"ID":"2af71207-973b-401b-bba1-e78a23a043b5","Type":"ContainerDied","Data":"6f1063672d6933be464f6d4561355dd1715d5dbefa2a1f18e31820d72bf3cfdd"} Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.217770 4786 scope.go:117] "RemoveContainer" containerID="3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08" Oct 02 07:02:59 crc kubenswrapper[4786]: E1002 07:02:59.218117 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08\": container with ID starting with 3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08 not found: ID does not exist" containerID="3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.218142 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08"} err="failed to get container status \"3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08\": rpc error: code = NotFound desc = could not find container \"3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08\": container with ID starting with 3fa220d510a22483f1378b3290e8512a5b4feaaa433f861fb766fda6f9fafb08 not found: ID does not exist" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.228449 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.233321 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.247885 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 07:02:59 crc kubenswrapper[4786]: E1002 07:02:59.248236 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="registry-server" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.248256 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="registry-server" Oct 02 07:02:59 crc kubenswrapper[4786]: E1002 07:02:59.248265 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565ac7e2-35f2-4085-96f7-d6f78e14a4e2" containerName="kube-state-metrics" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.248272 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="565ac7e2-35f2-4085-96f7-d6f78e14a4e2" containerName="kube-state-metrics" Oct 02 07:02:59 crc kubenswrapper[4786]: E1002 07:02:59.248283 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="extract-content" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.248290 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="extract-content" Oct 02 07:02:59 crc kubenswrapper[4786]: E1002 07:02:59.248302 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="extract-utilities" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.248307 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="extract-utilities" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.248506 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="565ac7e2-35f2-4085-96f7-d6f78e14a4e2" containerName="kube-state-metrics" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.248533 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c108f77a-18c2-4814-a4f8-edfcc456e1ae" containerName="registry-server" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.249161 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.251172 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.251316 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.258156 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.374871 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.375201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmhgv\" (UniqueName: \"kubernetes.io/projected/e002986c-0138-4b99-98a9-3d2095810fb4-kube-api-access-hmhgv\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.375227 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.375275 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.478147 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmhgv\" (UniqueName: \"kubernetes.io/projected/e002986c-0138-4b99-98a9-3d2095810fb4-kube-api-access-hmhgv\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.478228 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.478406 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.478478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.483377 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.483479 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.484236 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e002986c-0138-4b99-98a9-3d2095810fb4-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.492913 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmhgv\" (UniqueName: \"kubernetes.io/projected/e002986c-0138-4b99-98a9-3d2095810fb4-kube-api-access-hmhgv\") pod \"kube-state-metrics-0\" (UID: \"e002986c-0138-4b99-98a9-3d2095810fb4\") " pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.538843 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.567841 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.580650 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-scripts\") pod \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.580993 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-config-data\") pod \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.581077 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ckwf\" (UniqueName: \"kubernetes.io/projected/478c626d-63fa-4342-b4b7-59d09c6ce3c1-kube-api-access-4ckwf\") pod \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.581165 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-combined-ca-bundle\") pod \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\" (UID: \"478c626d-63fa-4342-b4b7-59d09c6ce3c1\") " Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.584477 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478c626d-63fa-4342-b4b7-59d09c6ce3c1-kube-api-access-4ckwf" (OuterVolumeSpecName: "kube-api-access-4ckwf") pod "478c626d-63fa-4342-b4b7-59d09c6ce3c1" (UID: "478c626d-63fa-4342-b4b7-59d09c6ce3c1"). InnerVolumeSpecName "kube-api-access-4ckwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.584820 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-scripts" (OuterVolumeSpecName: "scripts") pod "478c626d-63fa-4342-b4b7-59d09c6ce3c1" (UID: "478c626d-63fa-4342-b4b7-59d09c6ce3c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.605070 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-config-data" (OuterVolumeSpecName: "config-data") pod "478c626d-63fa-4342-b4b7-59d09c6ce3c1" (UID: "478c626d-63fa-4342-b4b7-59d09c6ce3c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.608663 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "478c626d-63fa-4342-b4b7-59d09c6ce3c1" (UID: "478c626d-63fa-4342-b4b7-59d09c6ce3c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.684514 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.684766 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.684780 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ckwf\" (UniqueName: \"kubernetes.io/projected/478c626d-63fa-4342-b4b7-59d09c6ce3c1-kube-api-access-4ckwf\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.684791 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478c626d-63fa-4342-b4b7-59d09c6ce3c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.761336 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.761957 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="ceilometer-central-agent" containerID="cri-o://367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1" gracePeriod=30 Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.761989 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="proxy-httpd" containerID="cri-o://8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53" gracePeriod=30 Oct 02 07:02:59 crc 
kubenswrapper[4786]: I1002 07:02:59.762079 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="sg-core" containerID="cri-o://af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9" gracePeriod=30 Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.762120 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="ceilometer-notification-agent" containerID="cri-o://2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17" gracePeriod=30 Oct 02 07:02:59 crc kubenswrapper[4786]: I1002 07:02:59.954739 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 07:02:59 crc kubenswrapper[4786]: W1002 07:02:59.958215 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode002986c_0138_4b99_98a9_3d2095810fb4.slice/crio-7c4c26b02c72c7e6aefea5c84a4cb6b7120c73c400f7342909f3b8ce51cb7a4a WatchSource:0}: Error finding container 7c4c26b02c72c7e6aefea5c84a4cb6b7120c73c400f7342909f3b8ce51cb7a4a: Status 404 returned error can't find the container with id 7c4c26b02c72c7e6aefea5c84a4cb6b7120c73c400f7342909f3b8ce51cb7a4a Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.187128 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565ac7e2-35f2-4085-96f7-d6f78e14a4e2" path="/var/lib/kubelet/pods/565ac7e2-35f2-4085-96f7-d6f78e14a4e2/volumes" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.207906 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerID="8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53" exitCode=0 Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.207930 4786 generic.go:334] "Generic (PLEG): container 
finished" podID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerID="af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9" exitCode=2 Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.207984 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerID="367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1" exitCode=0 Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.207957 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerDied","Data":"8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53"} Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.208078 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerDied","Data":"af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9"} Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.208090 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerDied","Data":"367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1"} Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.209762 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-crkmc" event={"ID":"478c626d-63fa-4342-b4b7-59d09c6ce3c1","Type":"ContainerDied","Data":"1f89e6c035ed62a1f6d80650bc88843b67dea4494913d53f760305c661420b6c"} Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.209882 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f89e6c035ed62a1f6d80650bc88843b67dea4494913d53f760305c661420b6c" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.209774 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-crkmc" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.210813 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e002986c-0138-4b99-98a9-3d2095810fb4","Type":"ContainerStarted","Data":"7c4c26b02c72c7e6aefea5c84a4cb6b7120c73c400f7342909f3b8ce51cb7a4a"} Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.268732 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 07:03:00 crc kubenswrapper[4786]: E1002 07:03:00.269136 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478c626d-63fa-4342-b4b7-59d09c6ce3c1" containerName="nova-cell1-conductor-db-sync" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.269155 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="478c626d-63fa-4342-b4b7-59d09c6ce3c1" containerName="nova-cell1-conductor-db-sync" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.269336 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="478c626d-63fa-4342-b4b7-59d09c6ce3c1" containerName="nova-cell1-conductor-db-sync" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.269935 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.272976 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.276816 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.396497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.396779 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.396839 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b9fq\" (UniqueName: \"kubernetes.io/projected/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-kube-api-access-6b9fq\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.498703 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc 
kubenswrapper[4786]: I1002 07:03:00.499009 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b9fq\" (UniqueName: \"kubernetes.io/projected/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-kube-api-access-6b9fq\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.499146 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.504369 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.504469 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.514971 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b9fq\" (UniqueName: \"kubernetes.io/projected/9c68bf5e-fcbd-48d6-bcfb-f36d68381c09-kube-api-access-6b9fq\") pod \"nova-cell1-conductor-0\" (UID: \"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09\") " pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.580808 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.587793 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.600451 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xrld\" (UniqueName: \"kubernetes.io/projected/2af71207-973b-401b-bba1-e78a23a043b5-kube-api-access-7xrld\") pod \"2af71207-973b-401b-bba1-e78a23a043b5\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.600532 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-scripts\") pod \"2af71207-973b-401b-bba1-e78a23a043b5\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.600655 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-combined-ca-bundle\") pod \"2af71207-973b-401b-bba1-e78a23a043b5\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.600840 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-config-data\") pod \"2af71207-973b-401b-bba1-e78a23a043b5\" (UID: \"2af71207-973b-401b-bba1-e78a23a043b5\") " Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.617327 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-scripts" (OuterVolumeSpecName: "scripts") pod "2af71207-973b-401b-bba1-e78a23a043b5" (UID: "2af71207-973b-401b-bba1-e78a23a043b5"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.626286 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af71207-973b-401b-bba1-e78a23a043b5-kube-api-access-7xrld" (OuterVolumeSpecName: "kube-api-access-7xrld") pod "2af71207-973b-401b-bba1-e78a23a043b5" (UID: "2af71207-973b-401b-bba1-e78a23a043b5"). InnerVolumeSpecName "kube-api-access-7xrld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.646097 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-config-data" (OuterVolumeSpecName: "config-data") pod "2af71207-973b-401b-bba1-e78a23a043b5" (UID: "2af71207-973b-401b-bba1-e78a23a043b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.647160 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2af71207-973b-401b-bba1-e78a23a043b5" (UID: "2af71207-973b-401b-bba1-e78a23a043b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.714648 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xrld\" (UniqueName: \"kubernetes.io/projected/2af71207-973b-401b-bba1-e78a23a043b5-kube-api-access-7xrld\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.714709 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.714722 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:00 crc kubenswrapper[4786]: I1002 07:03:00.714732 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2af71207-973b-401b-bba1-e78a23a043b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.000618 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.221758 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e002986c-0138-4b99-98a9-3d2095810fb4","Type":"ContainerStarted","Data":"56561e837e4b718ca5580d341ccfd61920005494a710b5f2ffc4ef78ad15d9df"} Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.222062 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.224752 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-nd9rf" Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.224921 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-nd9rf" event={"ID":"2af71207-973b-401b-bba1-e78a23a043b5","Type":"ContainerDied","Data":"b034663c31a1c52c9099e0b8dbbb27f689acebb941d3fb13d1358606b09796be"} Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.224954 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b034663c31a1c52c9099e0b8dbbb27f689acebb941d3fb13d1358606b09796be" Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.226469 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09","Type":"ContainerStarted","Data":"5e61321fdde89f613044907d222424eb8b6626e4c7e25fd50c890dc24c8001e8"} Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.226511 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9c68bf5e-fcbd-48d6-bcfb-f36d68381c09","Type":"ContainerStarted","Data":"0dcddabeea579f4e16efdd073e6859fbc532f57ace83478727f4d190e8a46ec6"} Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.226813 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.243818 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.967909079 podStartE2EDuration="2.243803358s" podCreationTimestamp="2025-10-02 07:02:59 +0000 UTC" firstStartedPulling="2025-10-02 07:02:59.960466658 +0000 UTC m=+990.081649789" lastFinishedPulling="2025-10-02 07:03:00.236360937 +0000 UTC m=+990.357544068" observedRunningTime="2025-10-02 07:03:01.235994087 +0000 UTC m=+991.357177229" watchObservedRunningTime="2025-10-02 07:03:01.243803358 +0000 UTC 
m=+991.364986489" Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.259433 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.2594226499999999 podStartE2EDuration="1.25942265s" podCreationTimestamp="2025-10-02 07:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:01.257325277 +0000 UTC m=+991.378508418" watchObservedRunningTime="2025-10-02 07:03:01.25942265 +0000 UTC m=+991.380605781" Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.385272 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.385468 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerName="nova-api-log" containerID="cri-o://bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7" gracePeriod=30 Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.385611 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerName="nova-api-api" containerID="cri-o://2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7" gracePeriod=30 Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.399934 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.400167 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8d776183-8989-4482-8810-cf37b4a85669" containerName="nova-scheduler-scheduler" containerID="cri-o://09a81c234a0a01beaa914d4bb4690513c15b70508dfb354ade4c1265f9a82d49" gracePeriod=30 Oct 02 07:03:01 crc kubenswrapper[4786]: E1002 07:03:01.528791 4786 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55fec9de_b3a5_4a32_9572_c49018b331f7.slice/crio-bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55fec9de_b3a5_4a32_9572_c49018b331f7.slice/crio-conmon-bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7.scope\": RecentStats: unable to find data in memory cache]"
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.846915 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.938931 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-combined-ca-bundle\") pod \"55fec9de-b3a5-4a32-9572-c49018b331f7\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") "
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.939048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fec9de-b3a5-4a32-9572-c49018b331f7-logs\") pod \"55fec9de-b3a5-4a32-9572-c49018b331f7\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") "
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.939083 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnsj5\" (UniqueName: \"kubernetes.io/projected/55fec9de-b3a5-4a32-9572-c49018b331f7-kube-api-access-gnsj5\") pod \"55fec9de-b3a5-4a32-9572-c49018b331f7\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") "
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.939297 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-config-data\") pod \"55fec9de-b3a5-4a32-9572-c49018b331f7\" (UID: \"55fec9de-b3a5-4a32-9572-c49018b331f7\") "
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.939500 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55fec9de-b3a5-4a32-9572-c49018b331f7-logs" (OuterVolumeSpecName: "logs") pod "55fec9de-b3a5-4a32-9572-c49018b331f7" (UID: "55fec9de-b3a5-4a32-9572-c49018b331f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.939884 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55fec9de-b3a5-4a32-9572-c49018b331f7-logs\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.944384 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55fec9de-b3a5-4a32-9572-c49018b331f7-kube-api-access-gnsj5" (OuterVolumeSpecName: "kube-api-access-gnsj5") pod "55fec9de-b3a5-4a32-9572-c49018b331f7" (UID: "55fec9de-b3a5-4a32-9572-c49018b331f7"). InnerVolumeSpecName "kube-api-access-gnsj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.962433 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55fec9de-b3a5-4a32-9572-c49018b331f7" (UID: "55fec9de-b3a5-4a32-9572-c49018b331f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:01 crc kubenswrapper[4786]: I1002 07:03:01.962854 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-config-data" (OuterVolumeSpecName: "config-data") pod "55fec9de-b3a5-4a32-9572-c49018b331f7" (UID: "55fec9de-b3a5-4a32-9572-c49018b331f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.041257 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnsj5\" (UniqueName: \"kubernetes.io/projected/55fec9de-b3a5-4a32-9572-c49018b331f7-kube-api-access-gnsj5\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.041288 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.041301 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55fec9de-b3a5-4a32-9572-c49018b331f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.239851 4786 generic.go:334] "Generic (PLEG): container finished" podID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerID="2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7" exitCode=0
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.239883 4786 generic.go:334] "Generic (PLEG): container finished" podID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerID="bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7" exitCode=143
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.239886 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.239927 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55fec9de-b3a5-4a32-9572-c49018b331f7","Type":"ContainerDied","Data":"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7"}
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.239953 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55fec9de-b3a5-4a32-9572-c49018b331f7","Type":"ContainerDied","Data":"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7"}
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.239966 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"55fec9de-b3a5-4a32-9572-c49018b331f7","Type":"ContainerDied","Data":"3af4bf2c3c46a31ce8804ff9b763de66743d932fa364eea656daf3e7a4e7cb67"}
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.239980 4786 scope.go:117] "RemoveContainer" containerID="2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.256405 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.275926 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.279119 4786 scope.go:117] "RemoveContainer" containerID="bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.286111 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:02 crc kubenswrapper[4786]: E1002 07:03:02.286536 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerName="nova-api-log"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.286616 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerName="nova-api-log"
Oct 02 07:03:02 crc kubenswrapper[4786]: E1002 07:03:02.286680 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerName="nova-api-api"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.286756 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerName="nova-api-api"
Oct 02 07:03:02 crc kubenswrapper[4786]: E1002 07:03:02.286830 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af71207-973b-401b-bba1-e78a23a043b5" containerName="nova-manage"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.286880 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af71207-973b-401b-bba1-e78a23a043b5" containerName="nova-manage"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.287076 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerName="nova-api-api"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.287131 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af71207-973b-401b-bba1-e78a23a043b5" containerName="nova-manage"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.287185 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" containerName="nova-api-log"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.288065 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.289672 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.295094 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.310701 4786 scope.go:117] "RemoveContainer" containerID="2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7"
Oct 02 07:03:02 crc kubenswrapper[4786]: E1002 07:03:02.311081 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7\": container with ID starting with 2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7 not found: ID does not exist" containerID="2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.311117 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7"} err="failed to get container status \"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7\": rpc error: code = NotFound desc = could not find container \"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7\": container with ID starting with 2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7 not found: ID does not exist"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.311140 4786 scope.go:117] "RemoveContainer" containerID="bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7"
Oct 02 07:03:02 crc kubenswrapper[4786]: E1002 07:03:02.311393 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7\": container with ID starting with bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7 not found: ID does not exist" containerID="bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.311426 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7"} err="failed to get container status \"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7\": rpc error: code = NotFound desc = could not find container \"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7\": container with ID starting with bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7 not found: ID does not exist"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.311454 4786 scope.go:117] "RemoveContainer" containerID="2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.311854 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7"} err="failed to get container status \"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7\": rpc error: code = NotFound desc = could not find container \"2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7\": container with ID starting with 2bf216d24284a723f93e4596a7fbed69f4a8d93dfdb7525c807596b7991dbcf7 not found: ID does not exist"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.311890 4786 scope.go:117] "RemoveContainer" containerID="bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.312139 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7"} err="failed to get container status \"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7\": rpc error: code = NotFound desc = could not find container \"bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7\": container with ID starting with bc23aa7b68e3eb08501cf18b09cf75169fb0c59f6d39bcf099452522fd7120d7 not found: ID does not exist"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.345829 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcd77\" (UniqueName: \"kubernetes.io/projected/3e85150e-5e6c-4676-8c0e-ed72db605bc0-kube-api-access-hcd77\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.345942 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-config-data\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.346041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.346367 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e85150e-5e6c-4676-8c0e-ed72db605bc0-logs\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.450410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcd77\" (UniqueName: \"kubernetes.io/projected/3e85150e-5e6c-4676-8c0e-ed72db605bc0-kube-api-access-hcd77\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.450523 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-config-data\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.450564 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.450626 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e85150e-5e6c-4676-8c0e-ed72db605bc0-logs\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.453340 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e85150e-5e6c-4676-8c0e-ed72db605bc0-logs\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.455229 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.455484 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-config-data\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.484271 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcd77\" (UniqueName: \"kubernetes.io/projected/3e85150e-5e6c-4676-8c0e-ed72db605bc0-kube-api-access-hcd77\") pod \"nova-api-0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.610488 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.722877 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz"
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.792608 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64d9b8cc-868kq"]
Oct 02 07:03:02 crc kubenswrapper[4786]: I1002 07:03:02.793544 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" podUID="aecc9aef-011b-45c9-8e92-0c41524ab191" containerName="dnsmasq-dns" containerID="cri-o://fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122" gracePeriod=10
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.048014 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:03 crc kubenswrapper[4786]: W1002 07:03:03.094868 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e85150e_5e6c_4676_8c0e_ed72db605bc0.slice/crio-9154fe59e610a917704d6f03410c8175a5d7dc15f8c1b05cb36ffb8b25c80303 WatchSource:0}: Error finding container 9154fe59e610a917704d6f03410c8175a5d7dc15f8c1b05cb36ffb8b25c80303: Status 404 returned error can't find the container with id 9154fe59e610a917704d6f03410c8175a5d7dc15f8c1b05cb36ffb8b25c80303
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.176537 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64d9b8cc-868kq"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.250017 4786 generic.go:334] "Generic (PLEG): container finished" podID="8d776183-8989-4482-8810-cf37b4a85669" containerID="09a81c234a0a01beaa914d4bb4690513c15b70508dfb354ade4c1265f9a82d49" exitCode=0
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.250086 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8d776183-8989-4482-8810-cf37b4a85669","Type":"ContainerDied","Data":"09a81c234a0a01beaa914d4bb4690513c15b70508dfb354ade4c1265f9a82d49"}
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.254311 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e85150e-5e6c-4676-8c0e-ed72db605bc0","Type":"ContainerStarted","Data":"48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4"}
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.254349 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e85150e-5e6c-4676-8c0e-ed72db605bc0","Type":"ContainerStarted","Data":"9154fe59e610a917704d6f03410c8175a5d7dc15f8c1b05cb36ffb8b25c80303"}
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.258041 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b630beef-e3c4-4a29-bbc4-d43349b02284","Type":"ContainerStarted","Data":"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10"}
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.258125 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b630beef-e3c4-4a29-bbc4-d43349b02284","Type":"ContainerStarted","Data":"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b"}
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.258145 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerName="nova-metadata-log" containerID="cri-o://04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b" gracePeriod=30
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.258244 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerName="nova-metadata-metadata" containerID="cri-o://1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10" gracePeriod=30
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.269214 4786 generic.go:334] "Generic (PLEG): container finished" podID="aecc9aef-011b-45c9-8e92-0c41524ab191" containerID="fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122" exitCode=0
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.269279 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" event={"ID":"aecc9aef-011b-45c9-8e92-0c41524ab191","Type":"ContainerDied","Data":"fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122"}
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.269301 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64d9b8cc-868kq" event={"ID":"aecc9aef-011b-45c9-8e92-0c41524ab191","Type":"ContainerDied","Data":"cee2b1ed022a47a6e3fce3f86034c67d93f341f5978bd7752ccd318169ff6cc3"}
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.269319 4786 scope.go:117] "RemoveContainer" containerID="fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.269435 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64d9b8cc-868kq"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.282077 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-sb\") pod \"aecc9aef-011b-45c9-8e92-0c41524ab191\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.282263 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-nb\") pod \"aecc9aef-011b-45c9-8e92-0c41524ab191\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.282417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7zvr\" (UniqueName: \"kubernetes.io/projected/aecc9aef-011b-45c9-8e92-0c41524ab191-kube-api-access-d7zvr\") pod \"aecc9aef-011b-45c9-8e92-0c41524ab191\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.282451 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-swift-storage-0\") pod \"aecc9aef-011b-45c9-8e92-0c41524ab191\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.282531 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-config\") pod \"aecc9aef-011b-45c9-8e92-0c41524ab191\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.282651 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-svc\") pod \"aecc9aef-011b-45c9-8e92-0c41524ab191\" (UID: \"aecc9aef-011b-45c9-8e92-0c41524ab191\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.295767 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.830359837 podStartE2EDuration="11.295745949s" podCreationTimestamp="2025-10-02 07:02:52 +0000 UTC" firstStartedPulling="2025-10-02 07:02:53.026666014 +0000 UTC m=+983.147849146" lastFinishedPulling="2025-10-02 07:03:02.492052127 +0000 UTC m=+992.613235258" observedRunningTime="2025-10-02 07:03:03.284938066 +0000 UTC m=+993.406121207" watchObservedRunningTime="2025-10-02 07:03:03.295745949 +0000 UTC m=+993.416929080"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.303643 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecc9aef-011b-45c9-8e92-0c41524ab191-kube-api-access-d7zvr" (OuterVolumeSpecName: "kube-api-access-d7zvr") pod "aecc9aef-011b-45c9-8e92-0c41524ab191" (UID: "aecc9aef-011b-45c9-8e92-0c41524ab191"). InnerVolumeSpecName "kube-api-access-d7zvr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.313164 4786 scope.go:117] "RemoveContainer" containerID="d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.326824 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aecc9aef-011b-45c9-8e92-0c41524ab191" (UID: "aecc9aef-011b-45c9-8e92-0c41524ab191"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.333893 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aecc9aef-011b-45c9-8e92-0c41524ab191" (UID: "aecc9aef-011b-45c9-8e92-0c41524ab191"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.343007 4786 scope.go:117] "RemoveContainer" containerID="fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122"
Oct 02 07:03:03 crc kubenswrapper[4786]: E1002 07:03:03.343339 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122\": container with ID starting with fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122 not found: ID does not exist" containerID="fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.343377 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122"} err="failed to get container status \"fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122\": rpc error: code = NotFound desc = could not find container \"fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122\": container with ID starting with fe33dd8388163aa396dde7e46baad4fac4dcdbfbb62f9cadc7a1a1dcfb291122 not found: ID does not exist"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.343416 4786 scope.go:117] "RemoveContainer" containerID="d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c"
Oct 02 07:03:03 crc kubenswrapper[4786]: E1002 07:03:03.343757 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c\": container with ID starting with d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c not found: ID does not exist" containerID="d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.343803 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c"} err="failed to get container status \"d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c\": rpc error: code = NotFound desc = could not find container \"d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c\": container with ID starting with d7dcf9a63e95a7e46193a7f83d27b9cadfb29e7d8c4a8995e9be8d361c6c571c not found: ID does not exist"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.344201 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aecc9aef-011b-45c9-8e92-0c41524ab191" (UID: "aecc9aef-011b-45c9-8e92-0c41524ab191"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.345290 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aecc9aef-011b-45c9-8e92-0c41524ab191" (UID: "aecc9aef-011b-45c9-8e92-0c41524ab191"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.353905 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-config" (OuterVolumeSpecName: "config") pod "aecc9aef-011b-45c9-8e92-0c41524ab191" (UID: "aecc9aef-011b-45c9-8e92-0c41524ab191"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.374497 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.387927 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.387949 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.387959 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7zvr\" (UniqueName: \"kubernetes.io/projected/aecc9aef-011b-45c9-8e92-0c41524ab191-kube-api-access-d7zvr\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.387970 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.387980 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-config\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.387988 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aecc9aef-011b-45c9-8e92-0c41524ab191-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.488711 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph7tm\" (UniqueName: \"kubernetes.io/projected/8d776183-8989-4482-8810-cf37b4a85669-kube-api-access-ph7tm\") pod \"8d776183-8989-4482-8810-cf37b4a85669\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.488950 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-config-data\") pod \"8d776183-8989-4482-8810-cf37b4a85669\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.489053 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-combined-ca-bundle\") pod \"8d776183-8989-4482-8810-cf37b4a85669\" (UID: \"8d776183-8989-4482-8810-cf37b4a85669\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.491388 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d776183-8989-4482-8810-cf37b4a85669-kube-api-access-ph7tm" (OuterVolumeSpecName: "kube-api-access-ph7tm") pod "8d776183-8989-4482-8810-cf37b4a85669" (UID: "8d776183-8989-4482-8810-cf37b4a85669"). InnerVolumeSpecName "kube-api-access-ph7tm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.511368 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d776183-8989-4482-8810-cf37b4a85669" (UID: "8d776183-8989-4482-8810-cf37b4a85669"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.518563 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-config-data" (OuterVolumeSpecName: "config-data") pod "8d776183-8989-4482-8810-cf37b4a85669" (UID: "8d776183-8989-4482-8810-cf37b4a85669"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.598775 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.598796 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph7tm\" (UniqueName: \"kubernetes.io/projected/8d776183-8989-4482-8810-cf37b4a85669-kube-api-access-ph7tm\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.598806 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d776183-8989-4482-8810-cf37b4a85669-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.611089 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64d9b8cc-868kq"]
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.617446 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64d9b8cc-868kq"]
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.777471 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.901929 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b630beef-e3c4-4a29-bbc4-d43349b02284-logs\") pod \"b630beef-e3c4-4a29-bbc4-d43349b02284\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.902012 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-combined-ca-bundle\") pod \"b630beef-e3c4-4a29-bbc4-d43349b02284\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.902154 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-config-data\") pod \"b630beef-e3c4-4a29-bbc4-d43349b02284\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.902277 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms7xw\" (UniqueName: \"kubernetes.io/projected/b630beef-e3c4-4a29-bbc4-d43349b02284-kube-api-access-ms7xw\") pod \"b630beef-e3c4-4a29-bbc4-d43349b02284\" (UID: \"b630beef-e3c4-4a29-bbc4-d43349b02284\") "
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.902421 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b630beef-e3c4-4a29-bbc4-d43349b02284-logs" (OuterVolumeSpecName: "logs") pod "b630beef-e3c4-4a29-bbc4-d43349b02284" (UID: "b630beef-e3c4-4a29-bbc4-d43349b02284"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.902955 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b630beef-e3c4-4a29-bbc4-d43349b02284-logs\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.912800 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b630beef-e3c4-4a29-bbc4-d43349b02284-kube-api-access-ms7xw" (OuterVolumeSpecName: "kube-api-access-ms7xw") pod "b630beef-e3c4-4a29-bbc4-d43349b02284" (UID: "b630beef-e3c4-4a29-bbc4-d43349b02284"). InnerVolumeSpecName "kube-api-access-ms7xw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.923955 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b630beef-e3c4-4a29-bbc4-d43349b02284" (UID: "b630beef-e3c4-4a29-bbc4-d43349b02284"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:03 crc kubenswrapper[4786]: I1002 07:03:03.924493 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-config-data" (OuterVolumeSpecName: "config-data") pod "b630beef-e3c4-4a29-bbc4-d43349b02284" (UID: "b630beef-e3c4-4a29-bbc4-d43349b02284"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.004571 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms7xw\" (UniqueName: \"kubernetes.io/projected/b630beef-e3c4-4a29-bbc4-d43349b02284-kube-api-access-ms7xw\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.004614 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.004623 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b630beef-e3c4-4a29-bbc4-d43349b02284-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.187671 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55fec9de-b3a5-4a32-9572-c49018b331f7" path="/var/lib/kubelet/pods/55fec9de-b3a5-4a32-9572-c49018b331f7/volumes" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.188685 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aecc9aef-011b-45c9-8e92-0c41524ab191" path="/var/lib/kubelet/pods/aecc9aef-011b-45c9-8e92-0c41524ab191/volumes" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.284908 4786 generic.go:334] "Generic (PLEG): container finished" podID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerID="1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10" exitCode=0 Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.284935 4786 generic.go:334] "Generic (PLEG): container finished" podID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerID="04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b" exitCode=143 Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.284960 4786 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.284998 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b630beef-e3c4-4a29-bbc4-d43349b02284","Type":"ContainerDied","Data":"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10"} Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.285046 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b630beef-e3c4-4a29-bbc4-d43349b02284","Type":"ContainerDied","Data":"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b"} Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.285057 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b630beef-e3c4-4a29-bbc4-d43349b02284","Type":"ContainerDied","Data":"25eb2b8d346be3cc1913f4f9e1727b33575fdb3922ed64dcc52aeae292a42197"} Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.285074 4786 scope.go:117] "RemoveContainer" containerID="1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.289753 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8d776183-8989-4482-8810-cf37b4a85669","Type":"ContainerDied","Data":"cecd40da4917e1bc99e301bdb8e8dcbae96e31169a9af9d7da1a38ffcebf0d18"} Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.289773 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.292200 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e85150e-5e6c-4676-8c0e-ed72db605bc0","Type":"ContainerStarted","Data":"e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542"} Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.305822 4786 scope.go:117] "RemoveContainer" containerID="04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.308793 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.316202 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329205 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:03:04 crc kubenswrapper[4786]: E1002 07:03:04.329559 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerName="nova-metadata-metadata" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329576 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerName="nova-metadata-metadata" Oct 02 07:03:04 crc kubenswrapper[4786]: E1002 07:03:04.329604 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecc9aef-011b-45c9-8e92-0c41524ab191" containerName="init" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329612 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecc9aef-011b-45c9-8e92-0c41524ab191" containerName="init" Oct 02 07:03:04 crc kubenswrapper[4786]: E1002 07:03:04.329626 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerName="nova-metadata-log" Oct 02 07:03:04 crc 
kubenswrapper[4786]: I1002 07:03:04.329639 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerName="nova-metadata-log" Oct 02 07:03:04 crc kubenswrapper[4786]: E1002 07:03:04.329654 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecc9aef-011b-45c9-8e92-0c41524ab191" containerName="dnsmasq-dns" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329660 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecc9aef-011b-45c9-8e92-0c41524ab191" containerName="dnsmasq-dns" Oct 02 07:03:04 crc kubenswrapper[4786]: E1002 07:03:04.329673 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d776183-8989-4482-8810-cf37b4a85669" containerName="nova-scheduler-scheduler" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329681 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d776183-8989-4482-8810-cf37b4a85669" containerName="nova-scheduler-scheduler" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329893 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerName="nova-metadata-log" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329907 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" containerName="nova-metadata-metadata" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329918 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecc9aef-011b-45c9-8e92-0c41524ab191" containerName="dnsmasq-dns" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.329934 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d776183-8989-4482-8810-cf37b4a85669" containerName="nova-scheduler-scheduler" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.330778 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.336015 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.336997 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.341557 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.341541724 podStartE2EDuration="2.341541724s" podCreationTimestamp="2025-10-02 07:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:04.322094456 +0000 UTC m=+994.443277587" watchObservedRunningTime="2025-10-02 07:03:04.341541724 +0000 UTC m=+994.462724855" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.354413 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.369735 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.379738 4786 scope.go:117] "RemoveContainer" containerID="1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.380569 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:04 crc kubenswrapper[4786]: E1002 07:03:04.380836 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10\": container with ID starting with 1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10 not found: ID does not exist" 
containerID="1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.380873 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10"} err="failed to get container status \"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10\": rpc error: code = NotFound desc = could not find container \"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10\": container with ID starting with 1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10 not found: ID does not exist" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.380896 4786 scope.go:117] "RemoveContainer" containerID="04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b" Oct 02 07:03:04 crc kubenswrapper[4786]: E1002 07:03:04.381784 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b\": container with ID starting with 04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b not found: ID does not exist" containerID="04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.381811 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b"} err="failed to get container status \"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b\": rpc error: code = NotFound desc = could not find container \"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b\": container with ID starting with 04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b not found: ID does not exist" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.381827 4786 scope.go:117] 
"RemoveContainer" containerID="1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.382153 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10"} err="failed to get container status \"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10\": rpc error: code = NotFound desc = could not find container \"1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10\": container with ID starting with 1a2d1acb311894b8af304077158e5f14503f32481c19b45d5e594651901dab10 not found: ID does not exist" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.382173 4786 scope.go:117] "RemoveContainer" containerID="04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.382431 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b"} err="failed to get container status \"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b\": rpc error: code = NotFound desc = could not find container \"04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b\": container with ID starting with 04f0527c9d11607c6c8ecbce6905c23b97bd59be9ff1114e19637958d9d0e75b not found: ID does not exist" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.382462 4786 scope.go:117] "RemoveContainer" containerID="09a81c234a0a01beaa914d4bb4690513c15b70508dfb354ade4c1265f9a82d49" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.386488 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.387499 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.389155 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.411856 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k79gn\" (UniqueName: \"kubernetes.io/projected/543ff50b-2b00-4425-9842-6a02a6b5d3c5-kube-api-access-k79gn\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.412011 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-config-data\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.412234 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.412390 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/543ff50b-2b00-4425-9842-6a02a6b5d3c5-logs\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.412465 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.422128 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.514735 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.514849 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/543ff50b-2b00-4425-9842-6a02a6b5d3c5-logs\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.514886 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.514932 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.515170 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-config-data\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.515234 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k79gn\" (UniqueName: \"kubernetes.io/projected/543ff50b-2b00-4425-9842-6a02a6b5d3c5-kube-api-access-k79gn\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.515250 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5nl\" (UniqueName: \"kubernetes.io/projected/729649ff-3385-42e4-a708-b174e19abe33-kube-api-access-8v5nl\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.515277 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/543ff50b-2b00-4425-9842-6a02a6b5d3c5-logs\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.515307 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-config-data\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.519111 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.519733 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.519991 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-config-data\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.529599 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k79gn\" (UniqueName: \"kubernetes.io/projected/543ff50b-2b00-4425-9842-6a02a6b5d3c5-kube-api-access-k79gn\") pod \"nova-metadata-0\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.617455 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-config-data\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.617495 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5nl\" (UniqueName: \"kubernetes.io/projected/729649ff-3385-42e4-a708-b174e19abe33-kube-api-access-8v5nl\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.617659 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.620933 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-config-data\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.621045 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.632149 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5nl\" (UniqueName: \"kubernetes.io/projected/729649ff-3385-42e4-a708-b174e19abe33-kube-api-access-8v5nl\") pod \"nova-scheduler-0\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.673374 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 07:03:04 crc kubenswrapper[4786]: I1002 07:03:04.723029 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.035187 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:03:05 crc kubenswrapper[4786]: W1002 07:03:05.037369 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod543ff50b_2b00_4425_9842_6a02a6b5d3c5.slice/crio-608b55785bfb5a2a4c98344ca977bed47a1f368ae13ed16e1ba64bd8bd2fc9cd WatchSource:0}: Error finding container 608b55785bfb5a2a4c98344ca977bed47a1f368ae13ed16e1ba64bd8bd2fc9cd: Status 404 returned error can't find the container with id 608b55785bfb5a2a4c98344ca977bed47a1f368ae13ed16e1ba64bd8bd2fc9cd Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.128526 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.300581 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"729649ff-3385-42e4-a708-b174e19abe33","Type":"ContainerStarted","Data":"bb576784bbebf1fc22365eaba90a170972dc723fbf45cd2777eca641ecad8679"} Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.300629 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"729649ff-3385-42e4-a708-b174e19abe33","Type":"ContainerStarted","Data":"301866e46f4e22704afaaa60110fde4534952936e6ef653401f3ac7ae3ac0584"} Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.301988 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"543ff50b-2b00-4425-9842-6a02a6b5d3c5","Type":"ContainerStarted","Data":"5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4"} Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.302018 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"543ff50b-2b00-4425-9842-6a02a6b5d3c5","Type":"ContainerStarted","Data":"4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638"} Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.302028 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"543ff50b-2b00-4425-9842-6a02a6b5d3c5","Type":"ContainerStarted","Data":"608b55785bfb5a2a4c98344ca977bed47a1f368ae13ed16e1ba64bd8bd2fc9cd"} Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.316226 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.316216386 podStartE2EDuration="1.316216386s" podCreationTimestamp="2025-10-02 07:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:05.310907309 +0000 UTC m=+995.432090450" watchObservedRunningTime="2025-10-02 07:03:05.316216386 +0000 UTC m=+995.437399517" Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.849729 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.870196 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.8701818810000002 podStartE2EDuration="1.870181881s" podCreationTimestamp="2025-10-02 07:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:05.329912062 +0000 UTC m=+995.451095203" watchObservedRunningTime="2025-10-02 07:03:05.870181881 +0000 UTC m=+995.991365012" Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.941417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-combined-ca-bundle\") pod \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.941467 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-run-httpd\") pod \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.941485 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-scripts\") pod \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.941508 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-log-httpd\") pod \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " 
Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.941548 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcdmf\" (UniqueName: \"kubernetes.io/projected/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-kube-api-access-xcdmf\") pod \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.941566 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-sg-core-conf-yaml\") pod \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.941633 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-config-data\") pod \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\" (UID: \"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb\") " Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.942069 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" (UID: "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.942239 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" (UID: "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.946361 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-scripts" (OuterVolumeSpecName: "scripts") pod "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" (UID: "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.958846 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-kube-api-access-xcdmf" (OuterVolumeSpecName: "kube-api-access-xcdmf") pod "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" (UID: "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb"). InnerVolumeSpecName "kube-api-access-xcdmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:03:05 crc kubenswrapper[4786]: I1002 07:03:05.962667 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" (UID: "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.007453 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" (UID: "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.012465 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-config-data" (OuterVolumeSpecName: "config-data") pod "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" (UID: "b0fbbc36-d2d7-4c19-aec8-baa1430a45bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.043718 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.043741 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.043750 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.043761 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.043770 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcdmf\" (UniqueName: \"kubernetes.io/projected/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-kube-api-access-xcdmf\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.043781 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.043788 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.186496 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d776183-8989-4482-8810-cf37b4a85669" path="/var/lib/kubelet/pods/8d776183-8989-4482-8810-cf37b4a85669/volumes" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.187020 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b630beef-e3c4-4a29-bbc4-d43349b02284" path="/var/lib/kubelet/pods/b630beef-e3c4-4a29-bbc4-d43349b02284/volumes" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.310292 4786 generic.go:334] "Generic (PLEG): container finished" podID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerID="2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17" exitCode=0 Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.310975 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.311388 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerDied","Data":"2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17"} Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.311414 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0fbbc36-d2d7-4c19-aec8-baa1430a45bb","Type":"ContainerDied","Data":"1211e0249ec700ac84335622a787b935ac2460625f807cf683edea38fa1767f8"} Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.311429 4786 scope.go:117] "RemoveContainer" containerID="8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.328076 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.334141 4786 scope.go:117] "RemoveContainer" containerID="af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.339163 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.348453 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:06 crc kubenswrapper[4786]: E1002 07:03:06.349072 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="ceilometer-notification-agent" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.349092 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="ceilometer-notification-agent" Oct 02 07:03:06 crc kubenswrapper[4786]: E1002 07:03:06.349117 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="proxy-httpd" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.349122 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="proxy-httpd" Oct 02 07:03:06 crc kubenswrapper[4786]: E1002 07:03:06.349140 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="sg-core" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.349145 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="sg-core" Oct 02 07:03:06 crc kubenswrapper[4786]: E1002 07:03:06.349160 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="ceilometer-central-agent" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.349165 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="ceilometer-central-agent" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.349330 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="ceilometer-central-agent" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.349343 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="ceilometer-notification-agent" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.355249 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="proxy-httpd" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.355291 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" containerName="sg-core" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.351969 4786 scope.go:117] "RemoveContainer" 
containerID="2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.356721 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.360290 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.360509 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.360520 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.361422 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.375213 4786 scope.go:117] "RemoveContainer" containerID="367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.389296 4786 scope.go:117] "RemoveContainer" containerID="8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53" Oct 02 07:03:06 crc kubenswrapper[4786]: E1002 07:03:06.389558 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53\": container with ID starting with 8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53 not found: ID does not exist" containerID="8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.389595 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53"} err="failed to get container status 
\"8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53\": rpc error: code = NotFound desc = could not find container \"8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53\": container with ID starting with 8e6199e159bd6a10e91dbeff8b9d3e1857df57445bcc7b1e602b9d937af3ba53 not found: ID does not exist" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.389616 4786 scope.go:117] "RemoveContainer" containerID="af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9" Oct 02 07:03:06 crc kubenswrapper[4786]: E1002 07:03:06.389900 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9\": container with ID starting with af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9 not found: ID does not exist" containerID="af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.389938 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9"} err="failed to get container status \"af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9\": rpc error: code = NotFound desc = could not find container \"af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9\": container with ID starting with af7338b7964d8b1058688ec65332fae43b53ef005bb2d07de01d1279c4bbb4b9 not found: ID does not exist" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.389962 4786 scope.go:117] "RemoveContainer" containerID="2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17" Oct 02 07:03:06 crc kubenswrapper[4786]: E1002 07:03:06.390221 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17\": container with ID starting with 2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17 not found: ID does not exist" containerID="2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.390243 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17"} err="failed to get container status \"2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17\": rpc error: code = NotFound desc = could not find container \"2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17\": container with ID starting with 2a97632f1d6272d6d6ebec0eaa091587a9929aa2866b878c4dae6ca5eae9da17 not found: ID does not exist" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.390256 4786 scope.go:117] "RemoveContainer" containerID="367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1" Oct 02 07:03:06 crc kubenswrapper[4786]: E1002 07:03:06.390463 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1\": container with ID starting with 367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1 not found: ID does not exist" containerID="367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.390482 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1"} err="failed to get container status \"367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1\": rpc error: code = NotFound desc = could not find container \"367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1\": container with ID 
starting with 367d9e28f41fff6f9ef9091dab0be56a6f1742b9ba147ece3671dbaa45f15fe1 not found: ID does not exist" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.449905 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-scripts\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.449963 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.450015 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb965\" (UniqueName: \"kubernetes.io/projected/2a512766-0b0f-4fff-8499-b6d42bc6facf-kube-api-access-xb965\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.450052 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-run-httpd\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.450373 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 
07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.450417 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-log-httpd\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.450461 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.450564 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-config-data\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.551936 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-config-data\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.552032 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-scripts\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.552066 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.552112 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb965\" (UniqueName: \"kubernetes.io/projected/2a512766-0b0f-4fff-8499-b6d42bc6facf-kube-api-access-xb965\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.552149 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-run-httpd\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.552267 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.552297 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-log-httpd\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.552342 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc 
kubenswrapper[4786]: I1002 07:03:06.552763 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-run-httpd\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.552870 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-log-httpd\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.560131 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-scripts\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.560247 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.560439 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-config-data\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.560519 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.560606 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.565426 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb965\" (UniqueName: \"kubernetes.io/projected/2a512766-0b0f-4fff-8499-b6d42bc6facf-kube-api-access-xb965\") pod \"ceilometer-0\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " pod="openstack/ceilometer-0" Oct 02 07:03:06 crc kubenswrapper[4786]: I1002 07:03:06.675434 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:03:07 crc kubenswrapper[4786]: I1002 07:03:07.064326 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:07 crc kubenswrapper[4786]: W1002 07:03:07.070066 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a512766_0b0f_4fff_8499_b6d42bc6facf.slice/crio-52dd636a184c5bd0d6bf444bf29353bb158969a565a6ee24111a38f3a20e56e9 WatchSource:0}: Error finding container 52dd636a184c5bd0d6bf444bf29353bb158969a565a6ee24111a38f3a20e56e9: Status 404 returned error can't find the container with id 52dd636a184c5bd0d6bf444bf29353bb158969a565a6ee24111a38f3a20e56e9 Oct 02 07:03:07 crc kubenswrapper[4786]: I1002 07:03:07.318816 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerStarted","Data":"52dd636a184c5bd0d6bf444bf29353bb158969a565a6ee24111a38f3a20e56e9"} Oct 02 07:03:08 crc kubenswrapper[4786]: I1002 
07:03:08.186973 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fbbc36-d2d7-4c19-aec8-baa1430a45bb" path="/var/lib/kubelet/pods/b0fbbc36-d2d7-4c19-aec8-baa1430a45bb/volumes" Oct 02 07:03:08 crc kubenswrapper[4786]: I1002 07:03:08.326559 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerStarted","Data":"4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070"} Oct 02 07:03:09 crc kubenswrapper[4786]: I1002 07:03:09.336006 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerStarted","Data":"85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62"} Oct 02 07:03:09 crc kubenswrapper[4786]: I1002 07:03:09.576049 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 07:03:09 crc kubenswrapper[4786]: I1002 07:03:09.673741 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 07:03:09 crc kubenswrapper[4786]: I1002 07:03:09.674587 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 07:03:09 crc kubenswrapper[4786]: I1002 07:03:09.723893 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 07:03:10 crc kubenswrapper[4786]: I1002 07:03:10.344033 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerStarted","Data":"ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8"} Oct 02 07:03:10 crc kubenswrapper[4786]: I1002 07:03:10.607220 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 07:03:11 crc kubenswrapper[4786]: I1002 
07:03:11.354511 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerStarted","Data":"578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200"} Oct 02 07:03:11 crc kubenswrapper[4786]: I1002 07:03:11.355175 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 07:03:12 crc kubenswrapper[4786]: I1002 07:03:12.611407 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 07:03:12 crc kubenswrapper[4786]: I1002 07:03:12.611451 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 07:03:13 crc kubenswrapper[4786]: I1002 07:03:13.693838 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 07:03:13 crc kubenswrapper[4786]: I1002 07:03:13.693851 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 07:03:14 crc kubenswrapper[4786]: I1002 07:03:14.673716 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 07:03:14 crc kubenswrapper[4786]: I1002 07:03:14.673757 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 07:03:14 crc kubenswrapper[4786]: I1002 07:03:14.723776 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 07:03:14 
crc kubenswrapper[4786]: I1002 07:03:14.743160 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 07:03:14 crc kubenswrapper[4786]: I1002 07:03:14.757975 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.859518451 podStartE2EDuration="8.757963128s" podCreationTimestamp="2025-10-02 07:03:06 +0000 UTC" firstStartedPulling="2025-10-02 07:03:07.074079137 +0000 UTC m=+997.195262258" lastFinishedPulling="2025-10-02 07:03:10.972523804 +0000 UTC m=+1001.093706935" observedRunningTime="2025-10-02 07:03:11.373328912 +0000 UTC m=+1001.494512044" watchObservedRunningTime="2025-10-02 07:03:14.757963128 +0000 UTC m=+1004.879146258" Oct 02 07:03:15 crc kubenswrapper[4786]: I1002 07:03:15.399778 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 07:03:15 crc kubenswrapper[4786]: I1002 07:03:15.685784 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 07:03:15 crc kubenswrapper[4786]: I1002 07:03:15.686012 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 07:03:22 crc kubenswrapper[4786]: I1002 07:03:22.613813 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 07:03:22 crc kubenswrapper[4786]: I1002 07:03:22.614389 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Oct 02 07:03:22 crc kubenswrapper[4786]: I1002 07:03:22.616072 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 07:03:22 crc kubenswrapper[4786]: I1002 07:03:22.616679 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.429069 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.431566 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.547326 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-558cfbd59c-vlkt9"] Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.548585 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.562850 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-558cfbd59c-vlkt9"] Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.613667 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-swift-storage-0\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.613999 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-sb\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " 
pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.614149 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-svc\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.614189 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-nb\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.614243 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbqvl\" (UniqueName: \"kubernetes.io/projected/aa599fa1-31c3-4f98-a942-c67de6fe96e7-kube-api-access-kbqvl\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.614287 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-config\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.715816 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-sb\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: 
\"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.715923 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-svc\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.715954 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-nb\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.715991 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbqvl\" (UniqueName: \"kubernetes.io/projected/aa599fa1-31c3-4f98-a942-c67de6fe96e7-kube-api-access-kbqvl\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.716026 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-config\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.716043 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-swift-storage-0\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " 
pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.716932 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-swift-storage-0\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.716965 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-nb\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.717014 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-sb\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.717053 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-config\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.717180 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-svc\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.733316 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbqvl\" (UniqueName: \"kubernetes.io/projected/aa599fa1-31c3-4f98-a942-c67de6fe96e7-kube-api-access-kbqvl\") pod \"dnsmasq-dns-558cfbd59c-vlkt9\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:23 crc kubenswrapper[4786]: I1002 07:03:23.868677 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:24 crc kubenswrapper[4786]: I1002 07:03:24.258449 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-558cfbd59c-vlkt9"] Oct 02 07:03:24 crc kubenswrapper[4786]: W1002 07:03:24.261397 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa599fa1_31c3_4f98_a942_c67de6fe96e7.slice/crio-d3dcafcde0239eb6835d9c97b45697b8f7a450edbacdbcee5e4b6f16e9c387d2 WatchSource:0}: Error finding container d3dcafcde0239eb6835d9c97b45697b8f7a450edbacdbcee5e4b6f16e9c387d2: Status 404 returned error can't find the container with id d3dcafcde0239eb6835d9c97b45697b8f7a450edbacdbcee5e4b6f16e9c387d2 Oct 02 07:03:24 crc kubenswrapper[4786]: I1002 07:03:24.437880 4786 generic.go:334] "Generic (PLEG): container finished" podID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" containerID="0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b" exitCode=0 Oct 02 07:03:24 crc kubenswrapper[4786]: I1002 07:03:24.437979 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" event={"ID":"aa599fa1-31c3-4f98-a942-c67de6fe96e7","Type":"ContainerDied","Data":"0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b"} Oct 02 07:03:24 crc kubenswrapper[4786]: I1002 07:03:24.438018 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" 
event={"ID":"aa599fa1-31c3-4f98-a942-c67de6fe96e7","Type":"ContainerStarted","Data":"d3dcafcde0239eb6835d9c97b45697b8f7a450edbacdbcee5e4b6f16e9c387d2"} Oct 02 07:03:24 crc kubenswrapper[4786]: I1002 07:03:24.679681 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 07:03:24 crc kubenswrapper[4786]: I1002 07:03:24.682556 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 07:03:24 crc kubenswrapper[4786]: I1002 07:03:24.682713 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.128875 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.129329 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="ceilometer-central-agent" containerID="cri-o://4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070" gracePeriod=30 Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.129423 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="proxy-httpd" containerID="cri-o://578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200" gracePeriod=30 Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.129467 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="sg-core" containerID="cri-o://ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8" gracePeriod=30 Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.129511 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="ceilometer-notification-agent" containerID="cri-o://85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62" gracePeriod=30 Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.139422 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.192:3000/\": EOF" Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.447248 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" event={"ID":"aa599fa1-31c3-4f98-a942-c67de6fe96e7","Type":"ContainerStarted","Data":"1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2"} Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.447992 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.451441 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerID="578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200" exitCode=0 Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.451469 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerID="ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8" exitCode=2 Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.453311 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerDied","Data":"578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200"} Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.453356 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerDied","Data":"ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8"} Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.461232 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.464344 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" podStartSLOduration=2.46432593 podStartE2EDuration="2.46432593s" podCreationTimestamp="2025-10-02 07:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:25.46198598 +0000 UTC m=+1015.583169131" watchObservedRunningTime="2025-10-02 07:03:25.46432593 +0000 UTC m=+1015.585509062" Oct 02 07:03:25 crc kubenswrapper[4786]: I1002 07:03:25.755041 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:03:26 crc kubenswrapper[4786]: I1002 07:03:26.463024 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerID="4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070" exitCode=0 Oct 02 07:03:26 crc kubenswrapper[4786]: I1002 07:03:26.463225 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerDied","Data":"4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070"} Oct 02 07:03:26 crc kubenswrapper[4786]: I1002 07:03:26.463360 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-log" containerID="cri-o://48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4" gracePeriod=30 Oct 02 07:03:26 crc kubenswrapper[4786]: I1002 07:03:26.463446 4786 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-api" containerID="cri-o://e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542" gracePeriod=30 Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.470805 4786 generic.go:334] "Generic (PLEG): container finished" podID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerID="48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4" exitCode=143 Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.471388 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e85150e-5e6c-4676-8c0e-ed72db605bc0","Type":"ContainerDied","Data":"48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4"} Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.472660 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.473210 4786 generic.go:334] "Generic (PLEG): container finished" podID="a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" containerID="b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0" exitCode=137 Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.473308 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24","Type":"ContainerDied","Data":"b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0"} Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.473363 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24","Type":"ContainerDied","Data":"575c71367a80cd7e6e82ec07a5c1ec09f34adcb17942c17ba7ee3aa04df6f943"} Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.473382 4786 scope.go:117] "RemoveContainer" 
containerID="b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.496679 4786 scope.go:117] "RemoveContainer" containerID="b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0" Oct 02 07:03:27 crc kubenswrapper[4786]: E1002 07:03:27.498143 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0\": container with ID starting with b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0 not found: ID does not exist" containerID="b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.498178 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0"} err="failed to get container status \"b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0\": rpc error: code = NotFound desc = could not find container \"b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0\": container with ID starting with b480c5c55f30afc7a2a1beb2ff5c5f3d8284b48fa75777e1fc445b594e1fe4d0 not found: ID does not exist" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.598733 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-combined-ca-bundle\") pod \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.599125 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8lf5\" (UniqueName: \"kubernetes.io/projected/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-kube-api-access-f8lf5\") pod \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\" (UID: 
\"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.599367 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-config-data\") pod \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\" (UID: \"a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24\") " Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.609823 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-kube-api-access-f8lf5" (OuterVolumeSpecName: "kube-api-access-f8lf5") pod "a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" (UID: "a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24"). InnerVolumeSpecName "kube-api-access-f8lf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.620437 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-config-data" (OuterVolumeSpecName: "config-data") pod "a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" (UID: "a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.621885 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" (UID: "a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.702063 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8lf5\" (UniqueName: \"kubernetes.io/projected/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-kube-api-access-f8lf5\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.702088 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:27 crc kubenswrapper[4786]: I1002 07:03:27.702097 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.413451 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.480884 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.484200 4786 generic.go:334] "Generic (PLEG): container finished" podID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerID="85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62" exitCode=0 Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.484317 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerDied","Data":"85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62"} Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.484412 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a512766-0b0f-4fff-8499-b6d42bc6facf","Type":"ContainerDied","Data":"52dd636a184c5bd0d6bf444bf29353bb158969a565a6ee24111a38f3a20e56e9"} Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.484484 4786 scope.go:117] "RemoveContainer" containerID="578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.484608 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.501456 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.509975 4786 scope.go:117] "RemoveContainer" containerID="ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.510655 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.514949 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-sg-core-conf-yaml\") pod \"2a512766-0b0f-4fff-8499-b6d42bc6facf\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.515873 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-run-httpd\") pod \"2a512766-0b0f-4fff-8499-b6d42bc6facf\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.516250 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2a512766-0b0f-4fff-8499-b6d42bc6facf" (UID: "2a512766-0b0f-4fff-8499-b6d42bc6facf"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.516269 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-ceilometer-tls-certs\") pod \"2a512766-0b0f-4fff-8499-b6d42bc6facf\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.516415 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-scripts\") pod \"2a512766-0b0f-4fff-8499-b6d42bc6facf\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.516493 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-log-httpd\") pod \"2a512766-0b0f-4fff-8499-b6d42bc6facf\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.516557 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-combined-ca-bundle\") pod \"2a512766-0b0f-4fff-8499-b6d42bc6facf\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.516640 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-config-data\") pod \"2a512766-0b0f-4fff-8499-b6d42bc6facf\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.516764 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb965\" (UniqueName: 
\"kubernetes.io/projected/2a512766-0b0f-4fff-8499-b6d42bc6facf-kube-api-access-xb965\") pod \"2a512766-0b0f-4fff-8499-b6d42bc6facf\" (UID: \"2a512766-0b0f-4fff-8499-b6d42bc6facf\") " Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.517019 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2a512766-0b0f-4fff-8499-b6d42bc6facf" (UID: "2a512766-0b0f-4fff-8499-b6d42bc6facf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.517514 4786 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.517601 4786 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a512766-0b0f-4fff-8499-b6d42bc6facf-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.519700 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-scripts" (OuterVolumeSpecName: "scripts") pod "2a512766-0b0f-4fff-8499-b6d42bc6facf" (UID: "2a512766-0b0f-4fff-8499-b6d42bc6facf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521186 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 07:03:28.521522 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="proxy-httpd" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521538 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="proxy-httpd" Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 07:03:28.521550 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="sg-core" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521556 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="sg-core" Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 07:03:28.521574 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="ceilometer-notification-agent" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521579 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="ceilometer-notification-agent" Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 07:03:28.521595 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="ceilometer-central-agent" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521600 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="ceilometer-central-agent" Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 07:03:28.521633 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521639 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521627 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a512766-0b0f-4fff-8499-b6d42bc6facf-kube-api-access-xb965" (OuterVolumeSpecName: "kube-api-access-xb965") pod "2a512766-0b0f-4fff-8499-b6d42bc6facf" (UID: "2a512766-0b0f-4fff-8499-b6d42bc6facf"). InnerVolumeSpecName "kube-api-access-xb965". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521803 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="ceilometer-notification-agent" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521819 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="proxy-httpd" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521829 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521841 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="ceilometer-central-agent" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.521851 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" containerName="sg-core" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.522352 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.524375 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.524444 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.528136 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.528335 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.537648 4786 scope.go:117] "RemoveContainer" containerID="85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.553919 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2a512766-0b0f-4fff-8499-b6d42bc6facf" (UID: "2a512766-0b0f-4fff-8499-b6d42bc6facf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.555025 4786 scope.go:117] "RemoveContainer" containerID="4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.565791 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2a512766-0b0f-4fff-8499-b6d42bc6facf" (UID: "2a512766-0b0f-4fff-8499-b6d42bc6facf"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.571137 4786 scope.go:117] "RemoveContainer" containerID="578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200" Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 07:03:28.571425 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200\": container with ID starting with 578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200 not found: ID does not exist" containerID="578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.571455 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200"} err="failed to get container status \"578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200\": rpc error: code = NotFound desc = could not find container \"578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200\": container with ID starting with 578fc1f94a2c11c8fd381243558c5e5d2e744e88fdaf1279a9f2ef2242a68200 not found: ID does not exist" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.571476 4786 scope.go:117] "RemoveContainer" containerID="ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8" Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 07:03:28.571760 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8\": container with ID starting with ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8 not found: ID does not exist" containerID="ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.571783 
4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8"} err="failed to get container status \"ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8\": rpc error: code = NotFound desc = could not find container \"ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8\": container with ID starting with ea017d44578d201c41ad7bc0265ca7274dda7aa37017f7fa57a1a2149c642ff8 not found: ID does not exist" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.571799 4786 scope.go:117] "RemoveContainer" containerID="85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62" Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 07:03:28.572337 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62\": container with ID starting with 85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62 not found: ID does not exist" containerID="85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.572445 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62"} err="failed to get container status \"85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62\": rpc error: code = NotFound desc = could not find container \"85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62\": container with ID starting with 85f2f9c60fe6932a7681e7c55ff3f628aa47c9bcef6cac5f6615698fe52ece62 not found: ID does not exist" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.572516 4786 scope.go:117] "RemoveContainer" containerID="4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070" Oct 02 07:03:28 crc kubenswrapper[4786]: E1002 
07:03:28.572941 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070\": container with ID starting with 4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070 not found: ID does not exist" containerID="4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.572966 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070"} err="failed to get container status \"4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070\": rpc error: code = NotFound desc = could not find container \"4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070\": container with ID starting with 4fcdd452ea3b689b78f8769d16e624f5a19da4e116798c07e120339802a9c070 not found: ID does not exist" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.576216 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a512766-0b0f-4fff-8499-b6d42bc6facf" (UID: "2a512766-0b0f-4fff-8499-b6d42bc6facf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.592243 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-config-data" (OuterVolumeSpecName: "config-data") pod "2a512766-0b0f-4fff-8499-b6d42bc6facf" (UID: "2a512766-0b0f-4fff-8499-b6d42bc6facf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619514 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbb8h\" (UniqueName: \"kubernetes.io/projected/841f7177-db9e-4e73-a7b4-ccf30d1538fb-kube-api-access-nbb8h\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619634 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619681 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619767 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619808 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619868 4786 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619878 4786 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619887 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619895 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619902 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a512766-0b0f-4fff-8499-b6d42bc6facf-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.619913 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb965\" (UniqueName: \"kubernetes.io/projected/2a512766-0b0f-4fff-8499-b6d42bc6facf-kube-api-access-xb965\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.720830 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbb8h\" (UniqueName: \"kubernetes.io/projected/841f7177-db9e-4e73-a7b4-ccf30d1538fb-kube-api-access-nbb8h\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.720916 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.720954 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.721004 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.721036 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.724021 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.724413 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.724646 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.724795 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/841f7177-db9e-4e73-a7b4-ccf30d1538fb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.747156 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbb8h\" (UniqueName: \"kubernetes.io/projected/841f7177-db9e-4e73-a7b4-ccf30d1538fb-kube-api-access-nbb8h\") pod \"nova-cell1-novncproxy-0\" (UID: \"841f7177-db9e-4e73-a7b4-ccf30d1538fb\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.814979 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.820889 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.827396 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 
02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.829815 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.834931 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.835118 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.835178 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.835271 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.848420 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.923762 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.923809 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.923927 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-272jq\" (UniqueName: 
\"kubernetes.io/projected/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-kube-api-access-272jq\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.923981 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-scripts\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.924004 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-run-httpd\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.924151 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.924241 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-log-httpd\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:28 crc kubenswrapper[4786]: I1002 07:03:28.924274 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-config-data\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " 
pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.025646 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-config-data\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.025789 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.025818 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.025894 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-272jq\" (UniqueName: \"kubernetes.io/projected/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-kube-api-access-272jq\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.025935 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-scripts\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.025952 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-run-httpd\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.025970 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.026041 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-log-httpd\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.026411 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-log-httpd\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.026476 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-run-httpd\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.029235 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.030032 4786 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.030656 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-scripts\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.030854 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-config-data\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.031085 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.039665 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-272jq\" (UniqueName: \"kubernetes.io/projected/2bebe6de-020d-4c9d-b4ea-a3069dace1c8-kube-api-access-272jq\") pod \"ceilometer-0\" (UID: \"2bebe6de-020d-4c9d-b4ea-a3069dace1c8\") " pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.146518 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.206673 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 07:03:29 crc kubenswrapper[4786]: W1002 07:03:29.208808 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod841f7177_db9e_4e73_a7b4_ccf30d1538fb.slice/crio-56374c7bfeb4fb10b90f54e5279d10e8d1482704231cbcd9468a4aa3bfcd2954 WatchSource:0}: Error finding container 56374c7bfeb4fb10b90f54e5279d10e8d1482704231cbcd9468a4aa3bfcd2954: Status 404 returned error can't find the container with id 56374c7bfeb4fb10b90f54e5279d10e8d1482704231cbcd9468a4aa3bfcd2954 Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.508136 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"841f7177-db9e-4e73-a7b4-ccf30d1538fb","Type":"ContainerStarted","Data":"65d210473ddc73fb11bd62724e00a8eba5ae8f458d02e321feeb54dab54bee84"} Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.508181 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"841f7177-db9e-4e73-a7b4-ccf30d1538fb","Type":"ContainerStarted","Data":"56374c7bfeb4fb10b90f54e5279d10e8d1482704231cbcd9468a4aa3bfcd2954"} Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.526716 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.530770 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.53076105 podStartE2EDuration="1.53076105s" podCreationTimestamp="2025-10-02 07:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:29.526825311 +0000 UTC m=+1019.648008452" 
watchObservedRunningTime="2025-10-02 07:03:29.53076105 +0000 UTC m=+1019.651944181" Oct 02 07:03:29 crc kubenswrapper[4786]: I1002 07:03:29.931758 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.044278 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-combined-ca-bundle\") pod \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.044366 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcd77\" (UniqueName: \"kubernetes.io/projected/3e85150e-5e6c-4676-8c0e-ed72db605bc0-kube-api-access-hcd77\") pod \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.044417 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e85150e-5e6c-4676-8c0e-ed72db605bc0-logs\") pod \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.044572 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-config-data\") pod \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\" (UID: \"3e85150e-5e6c-4676-8c0e-ed72db605bc0\") " Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.044815 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e85150e-5e6c-4676-8c0e-ed72db605bc0-logs" (OuterVolumeSpecName: "logs") pod "3e85150e-5e6c-4676-8c0e-ed72db605bc0" (UID: "3e85150e-5e6c-4676-8c0e-ed72db605bc0"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.045160 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e85150e-5e6c-4676-8c0e-ed72db605bc0-logs\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.048386 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e85150e-5e6c-4676-8c0e-ed72db605bc0-kube-api-access-hcd77" (OuterVolumeSpecName: "kube-api-access-hcd77") pod "3e85150e-5e6c-4676-8c0e-ed72db605bc0" (UID: "3e85150e-5e6c-4676-8c0e-ed72db605bc0"). InnerVolumeSpecName "kube-api-access-hcd77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.064542 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-config-data" (OuterVolumeSpecName: "config-data") pod "3e85150e-5e6c-4676-8c0e-ed72db605bc0" (UID: "3e85150e-5e6c-4676-8c0e-ed72db605bc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.066203 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e85150e-5e6c-4676-8c0e-ed72db605bc0" (UID: "3e85150e-5e6c-4676-8c0e-ed72db605bc0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.146571 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.146595 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcd77\" (UniqueName: \"kubernetes.io/projected/3e85150e-5e6c-4676-8c0e-ed72db605bc0-kube-api-access-hcd77\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.146607 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e85150e-5e6c-4676-8c0e-ed72db605bc0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.188013 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a512766-0b0f-4fff-8499-b6d42bc6facf" path="/var/lib/kubelet/pods/2a512766-0b0f-4fff-8499-b6d42bc6facf/volumes" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.191750 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24" path="/var/lib/kubelet/pods/a80e1e53-545c-4e0c-a9f1-07cb2f3b7f24/volumes" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.515876 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bebe6de-020d-4c9d-b4ea-a3069dace1c8","Type":"ContainerStarted","Data":"3e9d04e3986243255ecaab7481af38af0ef160a01ed6fa0f98294a566926bb11"} Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.516084 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bebe6de-020d-4c9d-b4ea-a3069dace1c8","Type":"ContainerStarted","Data":"37f4c7574d28ff16b1862d02fed26befdcadf9518d0980064ae92fe1f0224548"} Oct 02 07:03:30 crc 
kubenswrapper[4786]: I1002 07:03:30.517169 4786 generic.go:334] "Generic (PLEG): container finished" podID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerID="e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542" exitCode=0 Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.517807 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.518199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e85150e-5e6c-4676-8c0e-ed72db605bc0","Type":"ContainerDied","Data":"e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542"} Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.518221 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3e85150e-5e6c-4676-8c0e-ed72db605bc0","Type":"ContainerDied","Data":"9154fe59e610a917704d6f03410c8175a5d7dc15f8c1b05cb36ffb8b25c80303"} Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.518236 4786 scope.go:117] "RemoveContainer" containerID="e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.535107 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.539487 4786 scope.go:117] "RemoveContainer" containerID="48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.543194 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.565234 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 07:03:30 crc kubenswrapper[4786]: E1002 07:03:30.565556 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-log" Oct 02 
07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.565568 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-log" Oct 02 07:03:30 crc kubenswrapper[4786]: E1002 07:03:30.565581 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-api" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.565587 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-api" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.565775 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-log" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.565803 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" containerName="nova-api-api" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.567037 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.568161 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.568414 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.569205 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.570273 4786 scope.go:117] "RemoveContainer" containerID="e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542" Oct 02 07:03:30 crc kubenswrapper[4786]: E1002 07:03:30.572725 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542\": container with ID starting with e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542 not found: ID does not exist" containerID="e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.572792 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542"} err="failed to get container status \"e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542\": rpc error: code = NotFound desc = could not find container \"e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542\": container with ID starting with e86c3ba991ee23e50aa4f16eb8a1014ca0fdb2d83ba5407fd4ac6df316fc9542 not found: ID does not exist" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.572817 4786 scope.go:117] "RemoveContainer" containerID="48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4" Oct 02 07:03:30 crc 
kubenswrapper[4786]: E1002 07:03:30.573980 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4\": container with ID starting with 48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4 not found: ID does not exist" containerID="48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.574021 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4"} err="failed to get container status \"48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4\": rpc error: code = NotFound desc = could not find container \"48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4\": container with ID starting with 48c7e7fc5707caa14b320272d7ef7901ccd001bc7434c05a48cf01952952e9d4 not found: ID does not exist" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.584378 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.654318 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.654534 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjl4n\" (UniqueName: \"kubernetes.io/projected/c8257d11-57cc-46e7-9f55-9c9120621466-kube-api-access-gjl4n\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 
07:03:30.654658 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8257d11-57cc-46e7-9f55-9c9120621466-logs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.654807 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-public-tls-certs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.654983 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.655051 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-config-data\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.756727 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.756903 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjl4n\" (UniqueName: 
\"kubernetes.io/projected/c8257d11-57cc-46e7-9f55-9c9120621466-kube-api-access-gjl4n\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.757043 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8257d11-57cc-46e7-9f55-9c9120621466-logs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.757116 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-public-tls-certs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.757250 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.757322 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-config-data\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.757422 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8257d11-57cc-46e7-9f55-9c9120621466-logs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.760270 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.760758 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-config-data\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.761458 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-public-tls-certs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.769042 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.770400 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjl4n\" (UniqueName: \"kubernetes.io/projected/c8257d11-57cc-46e7-9f55-9c9120621466-kube-api-access-gjl4n\") pod \"nova-api-0\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " pod="openstack/nova-api-0" Oct 02 07:03:30 crc kubenswrapper[4786]: I1002 07:03:30.896110 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 07:03:31 crc kubenswrapper[4786]: I1002 07:03:31.262905 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:03:31 crc kubenswrapper[4786]: W1002 07:03:31.263380 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8257d11_57cc_46e7_9f55_9c9120621466.slice/crio-f1a2004539ba46242ce43d824259c9ff1488083711e06e347f60ce12a66ac5ab WatchSource:0}: Error finding container f1a2004539ba46242ce43d824259c9ff1488083711e06e347f60ce12a66ac5ab: Status 404 returned error can't find the container with id f1a2004539ba46242ce43d824259c9ff1488083711e06e347f60ce12a66ac5ab Oct 02 07:03:31 crc kubenswrapper[4786]: I1002 07:03:31.527127 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bebe6de-020d-4c9d-b4ea-a3069dace1c8","Type":"ContainerStarted","Data":"d022409865ca76ec03e687160f044f7cdef9f2a2cc30d74aa9ba6e62289e5597"} Oct 02 07:03:31 crc kubenswrapper[4786]: I1002 07:03:31.528614 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8257d11-57cc-46e7-9f55-9c9120621466","Type":"ContainerStarted","Data":"0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31"} Oct 02 07:03:31 crc kubenswrapper[4786]: I1002 07:03:31.528649 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8257d11-57cc-46e7-9f55-9c9120621466","Type":"ContainerStarted","Data":"a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1"} Oct 02 07:03:31 crc kubenswrapper[4786]: I1002 07:03:31.528660 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8257d11-57cc-46e7-9f55-9c9120621466","Type":"ContainerStarted","Data":"f1a2004539ba46242ce43d824259c9ff1488083711e06e347f60ce12a66ac5ab"} Oct 02 07:03:32 crc kubenswrapper[4786]: E1002 07:03:32.133271 4786 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e85150e_5e6c_4676_8c0e_ed72db605bc0.slice\": RecentStats: unable to find data in memory cache]" Oct 02 07:03:32 crc kubenswrapper[4786]: I1002 07:03:32.187972 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e85150e-5e6c-4676-8c0e-ed72db605bc0" path="/var/lib/kubelet/pods/3e85150e-5e6c-4676-8c0e-ed72db605bc0/volumes" Oct 02 07:03:32 crc kubenswrapper[4786]: I1002 07:03:32.536788 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bebe6de-020d-4c9d-b4ea-a3069dace1c8","Type":"ContainerStarted","Data":"36baa7e1434dd9a415bc359d7ebd7fd192cb17df027e71ae119d6bb272dba520"} Oct 02 07:03:33 crc kubenswrapper[4786]: I1002 07:03:33.546927 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bebe6de-020d-4c9d-b4ea-a3069dace1c8","Type":"ContainerStarted","Data":"cfae5728ed4d760fb427a1c30408cf846c581b9c9eecfbf3f3026fa67cba918d"} Oct 02 07:03:33 crc kubenswrapper[4786]: I1002 07:03:33.548138 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 07:03:33 crc kubenswrapper[4786]: I1002 07:03:33.562802 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.562790485 podStartE2EDuration="3.562790485s" podCreationTimestamp="2025-10-02 07:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:31.545732461 +0000 UTC m=+1021.666915602" watchObservedRunningTime="2025-10-02 07:03:33.562790485 +0000 UTC m=+1023.683973616" Oct 02 07:03:33 crc kubenswrapper[4786]: I1002 07:03:33.563426 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=2.149152998 podStartE2EDuration="5.56341931s" podCreationTimestamp="2025-10-02 07:03:28 +0000 UTC" firstStartedPulling="2025-10-02 07:03:29.533167336 +0000 UTC m=+1019.654350467" lastFinishedPulling="2025-10-02 07:03:32.947433647 +0000 UTC m=+1023.068616779" observedRunningTime="2025-10-02 07:03:33.562247762 +0000 UTC m=+1023.683430903" watchObservedRunningTime="2025-10-02 07:03:33.56341931 +0000 UTC m=+1023.684602441" Oct 02 07:03:33 crc kubenswrapper[4786]: I1002 07:03:33.849460 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:33 crc kubenswrapper[4786]: I1002 07:03:33.869751 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:03:33 crc kubenswrapper[4786]: I1002 07:03:33.912507 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d65b9d95f-ktpgz"] Oct 02 07:03:33 crc kubenswrapper[4786]: I1002 07:03:33.912769 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" podUID="a94b58fb-29b4-4b36-9229-748bb7705f42" containerName="dnsmasq-dns" containerID="cri-o://bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144" gracePeriod=10 Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.355968 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.517733 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-swift-storage-0\") pod \"a94b58fb-29b4-4b36-9229-748bb7705f42\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.517903 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-svc\") pod \"a94b58fb-29b4-4b36-9229-748bb7705f42\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.517929 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-nb\") pod \"a94b58fb-29b4-4b36-9229-748bb7705f42\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.518015 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nc5d\" (UniqueName: \"kubernetes.io/projected/a94b58fb-29b4-4b36-9229-748bb7705f42-kube-api-access-7nc5d\") pod \"a94b58fb-29b4-4b36-9229-748bb7705f42\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.518048 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-sb\") pod \"a94b58fb-29b4-4b36-9229-748bb7705f42\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.518065 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-config\") pod \"a94b58fb-29b4-4b36-9229-748bb7705f42\" (UID: \"a94b58fb-29b4-4b36-9229-748bb7705f42\") " Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.524425 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94b58fb-29b4-4b36-9229-748bb7705f42-kube-api-access-7nc5d" (OuterVolumeSpecName: "kube-api-access-7nc5d") pod "a94b58fb-29b4-4b36-9229-748bb7705f42" (UID: "a94b58fb-29b4-4b36-9229-748bb7705f42"). InnerVolumeSpecName "kube-api-access-7nc5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.558502 4786 generic.go:334] "Generic (PLEG): container finished" podID="a94b58fb-29b4-4b36-9229-748bb7705f42" containerID="bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144" exitCode=0 Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.558770 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a94b58fb-29b4-4b36-9229-748bb7705f42" (UID: "a94b58fb-29b4-4b36-9229-748bb7705f42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.558890 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.558872 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" event={"ID":"a94b58fb-29b4-4b36-9229-748bb7705f42","Type":"ContainerDied","Data":"bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144"} Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.559048 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d65b9d95f-ktpgz" event={"ID":"a94b58fb-29b4-4b36-9229-748bb7705f42","Type":"ContainerDied","Data":"f5c329dda923382eafe51e1416084e6e170b5921e0755be4ae2332ee3f05819d"} Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.559077 4786 scope.go:117] "RemoveContainer" containerID="bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.567229 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-config" (OuterVolumeSpecName: "config") pod "a94b58fb-29b4-4b36-9229-748bb7705f42" (UID: "a94b58fb-29b4-4b36-9229-748bb7705f42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.571737 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a94b58fb-29b4-4b36-9229-748bb7705f42" (UID: "a94b58fb-29b4-4b36-9229-748bb7705f42"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.578292 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a94b58fb-29b4-4b36-9229-748bb7705f42" (UID: "a94b58fb-29b4-4b36-9229-748bb7705f42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.587106 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a94b58fb-29b4-4b36-9229-748bb7705f42" (UID: "a94b58fb-29b4-4b36-9229-748bb7705f42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.619360 4786 scope.go:117] "RemoveContainer" containerID="f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.620502 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.620533 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.620545 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.620558 4786 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-7nc5d\" (UniqueName: \"kubernetes.io/projected/a94b58fb-29b4-4b36-9229-748bb7705f42-kube-api-access-7nc5d\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.620568 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.620578 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a94b58fb-29b4-4b36-9229-748bb7705f42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.636562 4786 scope.go:117] "RemoveContainer" containerID="bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144" Oct 02 07:03:34 crc kubenswrapper[4786]: E1002 07:03:34.636904 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144\": container with ID starting with bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144 not found: ID does not exist" containerID="bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.636933 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144"} err="failed to get container status \"bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144\": rpc error: code = NotFound desc = could not find container \"bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144\": container with ID starting with bf6f200473e8d3242febc8646ed4764bc61bc89e058b9f4a8bcde3a255e34144 not found: ID does not exist" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.636956 
4786 scope.go:117] "RemoveContainer" containerID="f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2" Oct 02 07:03:34 crc kubenswrapper[4786]: E1002 07:03:34.637272 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2\": container with ID starting with f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2 not found: ID does not exist" containerID="f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.637295 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2"} err="failed to get container status \"f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2\": rpc error: code = NotFound desc = could not find container \"f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2\": container with ID starting with f80e77db3f3238797f62a3409b81cf43d8e34b97ebf3a3f8b9ec1fb1c4012eb2 not found: ID does not exist" Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.885669 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d65b9d95f-ktpgz"] Oct 02 07:03:34 crc kubenswrapper[4786]: I1002 07:03:34.892665 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d65b9d95f-ktpgz"] Oct 02 07:03:36 crc kubenswrapper[4786]: I1002 07:03:36.190041 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94b58fb-29b4-4b36-9229-748bb7705f42" path="/var/lib/kubelet/pods/a94b58fb-29b4-4b36-9229-748bb7705f42/volumes" Oct 02 07:03:38 crc kubenswrapper[4786]: I1002 07:03:38.849782 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:38 crc kubenswrapper[4786]: I1002 07:03:38.864966 4786 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.608146 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.702906 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-c974n"] Oct 02 07:03:39 crc kubenswrapper[4786]: E1002 07:03:39.703209 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94b58fb-29b4-4b36-9229-748bb7705f42" containerName="init" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.703227 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94b58fb-29b4-4b36-9229-748bb7705f42" containerName="init" Oct 02 07:03:39 crc kubenswrapper[4786]: E1002 07:03:39.703237 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94b58fb-29b4-4b36-9229-748bb7705f42" containerName="dnsmasq-dns" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.703242 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94b58fb-29b4-4b36-9229-748bb7705f42" containerName="dnsmasq-dns" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.703431 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94b58fb-29b4-4b36-9229-748bb7705f42" containerName="dnsmasq-dns" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.703949 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.705571 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.711399 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.715369 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-c974n"] Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.799896 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-scripts\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.799966 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-config-data\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.799991 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.800180 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndwg2\" (UniqueName: 
\"kubernetes.io/projected/181f9506-2b83-4815-b417-a5b3eff7d763-kube-api-access-ndwg2\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.902077 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndwg2\" (UniqueName: \"kubernetes.io/projected/181f9506-2b83-4815-b417-a5b3eff7d763-kube-api-access-ndwg2\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.902229 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-scripts\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.902290 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-config-data\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.902313 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.906977 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.907048 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-scripts\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.907577 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-config-data\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:39 crc kubenswrapper[4786]: I1002 07:03:39.915915 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndwg2\" (UniqueName: \"kubernetes.io/projected/181f9506-2b83-4815-b417-a5b3eff7d763-kube-api-access-ndwg2\") pod \"nova-cell1-cell-mapping-c974n\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:40 crc kubenswrapper[4786]: I1002 07:03:40.024399 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:40 crc kubenswrapper[4786]: I1002 07:03:40.402939 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-c974n"] Oct 02 07:03:40 crc kubenswrapper[4786]: I1002 07:03:40.603038 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-c974n" event={"ID":"181f9506-2b83-4815-b417-a5b3eff7d763","Type":"ContainerStarted","Data":"b3bf505f62f0b9ea0997cd3c970ad0fc9c924141a6a43b7111eeb1beed5a8977"} Oct 02 07:03:40 crc kubenswrapper[4786]: I1002 07:03:40.603264 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-c974n" event={"ID":"181f9506-2b83-4815-b417-a5b3eff7d763","Type":"ContainerStarted","Data":"3f7a15ee269a6ef6bd2d8443848af3e0486bbff9a02e02a94ae50590efd49015"} Oct 02 07:03:40 crc kubenswrapper[4786]: I1002 07:03:40.618497 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-c974n" podStartSLOduration=1.6184810010000001 podStartE2EDuration="1.618481001s" podCreationTimestamp="2025-10-02 07:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:40.612495858 +0000 UTC m=+1030.733678999" watchObservedRunningTime="2025-10-02 07:03:40.618481001 +0000 UTC m=+1030.739664122" Oct 02 07:03:40 crc kubenswrapper[4786]: I1002 07:03:40.896994 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 07:03:40 crc kubenswrapper[4786]: I1002 07:03:40.897032 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 07:03:41 crc kubenswrapper[4786]: I1002 07:03:41.909816 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-api" 
probeResult="failure" output="Get \"https://10.217.0.196:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 07:03:41 crc kubenswrapper[4786]: I1002 07:03:41.909886 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 07:03:42 crc kubenswrapper[4786]: E1002 07:03:42.345416 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e85150e_5e6c_4676_8c0e_ed72db605bc0.slice\": RecentStats: unable to find data in memory cache]" Oct 02 07:03:44 crc kubenswrapper[4786]: I1002 07:03:44.631816 4786 generic.go:334] "Generic (PLEG): container finished" podID="181f9506-2b83-4815-b417-a5b3eff7d763" containerID="b3bf505f62f0b9ea0997cd3c970ad0fc9c924141a6a43b7111eeb1beed5a8977" exitCode=0 Oct 02 07:03:44 crc kubenswrapper[4786]: I1002 07:03:44.631893 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-c974n" event={"ID":"181f9506-2b83-4815-b417-a5b3eff7d763","Type":"ContainerDied","Data":"b3bf505f62f0b9ea0997cd3c970ad0fc9c924141a6a43b7111eeb1beed5a8977"} Oct 02 07:03:45 crc kubenswrapper[4786]: I1002 07:03:45.888652 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:45 crc kubenswrapper[4786]: I1002 07:03:45.999784 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndwg2\" (UniqueName: \"kubernetes.io/projected/181f9506-2b83-4815-b417-a5b3eff7d763-kube-api-access-ndwg2\") pod \"181f9506-2b83-4815-b417-a5b3eff7d763\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " Oct 02 07:03:45 crc kubenswrapper[4786]: I1002 07:03:45.999869 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-config-data\") pod \"181f9506-2b83-4815-b417-a5b3eff7d763\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " Oct 02 07:03:45 crc kubenswrapper[4786]: I1002 07:03:45.999965 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-scripts\") pod \"181f9506-2b83-4815-b417-a5b3eff7d763\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.000072 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-combined-ca-bundle\") pod \"181f9506-2b83-4815-b417-a5b3eff7d763\" (UID: \"181f9506-2b83-4815-b417-a5b3eff7d763\") " Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.003990 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-scripts" (OuterVolumeSpecName: "scripts") pod "181f9506-2b83-4815-b417-a5b3eff7d763" (UID: "181f9506-2b83-4815-b417-a5b3eff7d763"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.004096 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181f9506-2b83-4815-b417-a5b3eff7d763-kube-api-access-ndwg2" (OuterVolumeSpecName: "kube-api-access-ndwg2") pod "181f9506-2b83-4815-b417-a5b3eff7d763" (UID: "181f9506-2b83-4815-b417-a5b3eff7d763"). InnerVolumeSpecName "kube-api-access-ndwg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.019716 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-config-data" (OuterVolumeSpecName: "config-data") pod "181f9506-2b83-4815-b417-a5b3eff7d763" (UID: "181f9506-2b83-4815-b417-a5b3eff7d763"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.019933 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "181f9506-2b83-4815-b417-a5b3eff7d763" (UID: "181f9506-2b83-4815-b417-a5b3eff7d763"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.101739 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.101763 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndwg2\" (UniqueName: \"kubernetes.io/projected/181f9506-2b83-4815-b417-a5b3eff7d763-kube-api-access-ndwg2\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.101775 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.101783 4786 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/181f9506-2b83-4815-b417-a5b3eff7d763-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.645891 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-c974n" event={"ID":"181f9506-2b83-4815-b417-a5b3eff7d763","Type":"ContainerDied","Data":"3f7a15ee269a6ef6bd2d8443848af3e0486bbff9a02e02a94ae50590efd49015"} Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.646098 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7a15ee269a6ef6bd2d8443848af3e0486bbff9a02e02a94ae50590efd49015" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.645927 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-c974n" Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.807909 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.808155 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-log" containerID="cri-o://a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1" gracePeriod=30 Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.808264 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-api" containerID="cri-o://0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31" gracePeriod=30 Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.823555 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.823768 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="729649ff-3385-42e4-a708-b174e19abe33" containerName="nova-scheduler-scheduler" containerID="cri-o://bb576784bbebf1fc22365eaba90a170972dc723fbf45cd2777eca641ecad8679" gracePeriod=30 Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.830762 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.830969 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-log" containerID="cri-o://4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638" gracePeriod=30 Oct 02 07:03:46 crc kubenswrapper[4786]: I1002 07:03:46.831047 4786 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-metadata" containerID="cri-o://5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4" gracePeriod=30 Oct 02 07:03:47 crc kubenswrapper[4786]: I1002 07:03:47.654462 4786 generic.go:334] "Generic (PLEG): container finished" podID="c8257d11-57cc-46e7-9f55-9c9120621466" containerID="a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1" exitCode=143 Oct 02 07:03:47 crc kubenswrapper[4786]: I1002 07:03:47.654544 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8257d11-57cc-46e7-9f55-9c9120621466","Type":"ContainerDied","Data":"a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1"} Oct 02 07:03:47 crc kubenswrapper[4786]: I1002 07:03:47.658132 4786 generic.go:334] "Generic (PLEG): container finished" podID="729649ff-3385-42e4-a708-b174e19abe33" containerID="bb576784bbebf1fc22365eaba90a170972dc723fbf45cd2777eca641ecad8679" exitCode=0 Oct 02 07:03:47 crc kubenswrapper[4786]: I1002 07:03:47.658203 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"729649ff-3385-42e4-a708-b174e19abe33","Type":"ContainerDied","Data":"bb576784bbebf1fc22365eaba90a170972dc723fbf45cd2777eca641ecad8679"} Oct 02 07:03:47 crc kubenswrapper[4786]: I1002 07:03:47.661155 4786 generic.go:334] "Generic (PLEG): container finished" podID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerID="4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638" exitCode=143 Oct 02 07:03:47 crc kubenswrapper[4786]: I1002 07:03:47.661192 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"543ff50b-2b00-4425-9842-6a02a6b5d3c5","Type":"ContainerDied","Data":"4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638"} Oct 02 07:03:47 crc kubenswrapper[4786]: 
I1002 07:03:47.887422 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.029408 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v5nl\" (UniqueName: \"kubernetes.io/projected/729649ff-3385-42e4-a708-b174e19abe33-kube-api-access-8v5nl\") pod \"729649ff-3385-42e4-a708-b174e19abe33\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.029631 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-combined-ca-bundle\") pod \"729649ff-3385-42e4-a708-b174e19abe33\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.029683 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-config-data\") pod \"729649ff-3385-42e4-a708-b174e19abe33\" (UID: \"729649ff-3385-42e4-a708-b174e19abe33\") " Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.034482 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/729649ff-3385-42e4-a708-b174e19abe33-kube-api-access-8v5nl" (OuterVolumeSpecName: "kube-api-access-8v5nl") pod "729649ff-3385-42e4-a708-b174e19abe33" (UID: "729649ff-3385-42e4-a708-b174e19abe33"). InnerVolumeSpecName "kube-api-access-8v5nl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.050005 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "729649ff-3385-42e4-a708-b174e19abe33" (UID: "729649ff-3385-42e4-a708-b174e19abe33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.050313 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-config-data" (OuterVolumeSpecName: "config-data") pod "729649ff-3385-42e4-a708-b174e19abe33" (UID: "729649ff-3385-42e4-a708-b174e19abe33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.131518 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.131548 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729649ff-3385-42e4-a708-b174e19abe33-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.131557 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v5nl\" (UniqueName: \"kubernetes.io/projected/729649ff-3385-42e4-a708-b174e19abe33-kube-api-access-8v5nl\") on node \"crc\" DevicePath \"\"" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.669180 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"729649ff-3385-42e4-a708-b174e19abe33","Type":"ContainerDied","Data":"301866e46f4e22704afaaa60110fde4534952936e6ef653401f3ac7ae3ac0584"} Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.669214 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.669230 4786 scope.go:117] "RemoveContainer" containerID="bb576784bbebf1fc22365eaba90a170972dc723fbf45cd2777eca641ecad8679" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.684152 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.690278 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.697444 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:48 crc kubenswrapper[4786]: E1002 07:03:48.697790 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="729649ff-3385-42e4-a708-b174e19abe33" containerName="nova-scheduler-scheduler" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.697807 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="729649ff-3385-42e4-a708-b174e19abe33" containerName="nova-scheduler-scheduler" Oct 02 07:03:48 crc kubenswrapper[4786]: E1002 07:03:48.697829 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181f9506-2b83-4815-b417-a5b3eff7d763" containerName="nova-manage" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.697835 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="181f9506-2b83-4815-b417-a5b3eff7d763" containerName="nova-manage" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.698013 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="729649ff-3385-42e4-a708-b174e19abe33" containerName="nova-scheduler-scheduler" Oct 02 07:03:48 crc 
kubenswrapper[4786]: I1002 07:03:48.698032 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="181f9506-2b83-4815-b417-a5b3eff7d763" containerName="nova-manage" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.698524 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.700208 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.705779 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.841172 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945deef9-0f62-4eeb-af6c-beb702d5458b-config-data\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.841279 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945deef9-0f62-4eeb-af6c-beb702d5458b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.841310 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwp6s\" (UniqueName: \"kubernetes.io/projected/945deef9-0f62-4eeb-af6c-beb702d5458b-kube-api-access-nwp6s\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.943050 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/945deef9-0f62-4eeb-af6c-beb702d5458b-config-data\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.943118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945deef9-0f62-4eeb-af6c-beb702d5458b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.943150 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwp6s\" (UniqueName: \"kubernetes.io/projected/945deef9-0f62-4eeb-af6c-beb702d5458b-kube-api-access-nwp6s\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.947332 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945deef9-0f62-4eeb-af6c-beb702d5458b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.948751 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945deef9-0f62-4eeb-af6c-beb702d5458b-config-data\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " pod="openstack/nova-scheduler-0" Oct 02 07:03:48 crc kubenswrapper[4786]: I1002 07:03:48.955316 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwp6s\" (UniqueName: \"kubernetes.io/projected/945deef9-0f62-4eeb-af6c-beb702d5458b-kube-api-access-nwp6s\") pod \"nova-scheduler-0\" (UID: \"945deef9-0f62-4eeb-af6c-beb702d5458b\") " 
pod="openstack/nova-scheduler-0" Oct 02 07:03:49 crc kubenswrapper[4786]: I1002 07:03:49.013438 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 07:03:49 crc kubenswrapper[4786]: W1002 07:03:49.383284 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod945deef9_0f62_4eeb_af6c_beb702d5458b.slice/crio-93e9e7bb84bfce34ccaab611c8a0a0920e9aa70e4414a8e2da5d52e5e42d3316 WatchSource:0}: Error finding container 93e9e7bb84bfce34ccaab611c8a0a0920e9aa70e4414a8e2da5d52e5e42d3316: Status 404 returned error can't find the container with id 93e9e7bb84bfce34ccaab611c8a0a0920e9aa70e4414a8e2da5d52e5e42d3316 Oct 02 07:03:49 crc kubenswrapper[4786]: I1002 07:03:49.383558 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 07:03:49 crc kubenswrapper[4786]: I1002 07:03:49.677282 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945deef9-0f62-4eeb-af6c-beb702d5458b","Type":"ContainerStarted","Data":"0878be5d2f7dc7f7939ce82b2e382a7fdc89035834523f2e2221d63ae122219e"} Oct 02 07:03:49 crc kubenswrapper[4786]: I1002 07:03:49.677497 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945deef9-0f62-4eeb-af6c-beb702d5458b","Type":"ContainerStarted","Data":"93e9e7bb84bfce34ccaab611c8a0a0920e9aa70e4414a8e2da5d52e5e42d3316"} Oct 02 07:03:49 crc kubenswrapper[4786]: I1002 07:03:49.689330 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.689317486 podStartE2EDuration="1.689317486s" podCreationTimestamp="2025-10-02 07:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:49.688734216 +0000 UTC m=+1039.809917347" 
watchObservedRunningTime="2025-10-02 07:03:49.689317486 +0000 UTC m=+1039.810500617" Oct 02 07:03:49 crc kubenswrapper[4786]: I1002 07:03:49.949148 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": read tcp 10.217.0.2:54406->10.217.0.190:8775: read: connection reset by peer" Oct 02 07:03:49 crc kubenswrapper[4786]: I1002 07:03:49.949836 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.190:8775/\": read tcp 10.217.0.2:54422->10.217.0.190:8775: read: connection reset by peer" Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.197570 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="729649ff-3385-42e4-a708-b174e19abe33" path="/var/lib/kubelet/pods/729649ff-3385-42e4-a708-b174e19abe33/volumes" Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.373406 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.378985 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.467261 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-nova-metadata-tls-certs\") pod \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.467514 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k79gn\" (UniqueName: \"kubernetes.io/projected/543ff50b-2b00-4425-9842-6a02a6b5d3c5-kube-api-access-k79gn\") pod \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.467627 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-internal-tls-certs\") pod \"c8257d11-57cc-46e7-9f55-9c9120621466\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.467767 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8257d11-57cc-46e7-9f55-9c9120621466-logs\") pod \"c8257d11-57cc-46e7-9f55-9c9120621466\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") " Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.467880 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-combined-ca-bundle\") pod \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") " Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.467959 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-combined-ca-bundle\") pod \"c8257d11-57cc-46e7-9f55-9c9120621466\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") "
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.468052 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/543ff50b-2b00-4425-9842-6a02a6b5d3c5-logs\") pod \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") "
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.468517 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjl4n\" (UniqueName: \"kubernetes.io/projected/c8257d11-57cc-46e7-9f55-9c9120621466-kube-api-access-gjl4n\") pod \"c8257d11-57cc-46e7-9f55-9c9120621466\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") "
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.468103 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8257d11-57cc-46e7-9f55-9c9120621466-logs" (OuterVolumeSpecName: "logs") pod "c8257d11-57cc-46e7-9f55-9c9120621466" (UID: "c8257d11-57cc-46e7-9f55-9c9120621466"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.468437 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/543ff50b-2b00-4425-9842-6a02a6b5d3c5-logs" (OuterVolumeSpecName: "logs") pod "543ff50b-2b00-4425-9842-6a02a6b5d3c5" (UID: "543ff50b-2b00-4425-9842-6a02a6b5d3c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.468893 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-config-data\") pod \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\" (UID: \"543ff50b-2b00-4425-9842-6a02a6b5d3c5\") "
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.468991 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-public-tls-certs\") pod \"c8257d11-57cc-46e7-9f55-9c9120621466\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") "
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.469026 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-config-data\") pod \"c8257d11-57cc-46e7-9f55-9c9120621466\" (UID: \"c8257d11-57cc-46e7-9f55-9c9120621466\") "
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.469680 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8257d11-57cc-46e7-9f55-9c9120621466-logs\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.469713 4786 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/543ff50b-2b00-4425-9842-6a02a6b5d3c5-logs\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.486535 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8257d11-57cc-46e7-9f55-9c9120621466-kube-api-access-gjl4n" (OuterVolumeSpecName: "kube-api-access-gjl4n") pod "c8257d11-57cc-46e7-9f55-9c9120621466" (UID: "c8257d11-57cc-46e7-9f55-9c9120621466"). InnerVolumeSpecName "kube-api-access-gjl4n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.487100 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543ff50b-2b00-4425-9842-6a02a6b5d3c5-kube-api-access-k79gn" (OuterVolumeSpecName: "kube-api-access-k79gn") pod "543ff50b-2b00-4425-9842-6a02a6b5d3c5" (UID: "543ff50b-2b00-4425-9842-6a02a6b5d3c5"). InnerVolumeSpecName "kube-api-access-k79gn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.494985 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "543ff50b-2b00-4425-9842-6a02a6b5d3c5" (UID: "543ff50b-2b00-4425-9842-6a02a6b5d3c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.496282 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-config-data" (OuterVolumeSpecName: "config-data") pod "c8257d11-57cc-46e7-9f55-9c9120621466" (UID: "c8257d11-57cc-46e7-9f55-9c9120621466"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.517104 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-config-data" (OuterVolumeSpecName: "config-data") pod "543ff50b-2b00-4425-9842-6a02a6b5d3c5" (UID: "543ff50b-2b00-4425-9842-6a02a6b5d3c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.518058 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "543ff50b-2b00-4425-9842-6a02a6b5d3c5" (UID: "543ff50b-2b00-4425-9842-6a02a6b5d3c5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.518861 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c8257d11-57cc-46e7-9f55-9c9120621466" (UID: "c8257d11-57cc-46e7-9f55-9c9120621466"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.520631 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c8257d11-57cc-46e7-9f55-9c9120621466" (UID: "c8257d11-57cc-46e7-9f55-9c9120621466"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.527274 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8257d11-57cc-46e7-9f55-9c9120621466" (UID: "c8257d11-57cc-46e7-9f55-9c9120621466"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571488 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571515 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k79gn\" (UniqueName: \"kubernetes.io/projected/543ff50b-2b00-4425-9842-6a02a6b5d3c5-kube-api-access-k79gn\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571525 4786 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571534 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571542 4786 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571572 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjl4n\" (UniqueName: \"kubernetes.io/projected/c8257d11-57cc-46e7-9f55-9c9120621466-kube-api-access-gjl4n\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571580 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/543ff50b-2b00-4425-9842-6a02a6b5d3c5-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571588 4786 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.571596 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8257d11-57cc-46e7-9f55-9c9120621466-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.686988 4786 generic.go:334] "Generic (PLEG): container finished" podID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerID="5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4" exitCode=0
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.687041 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"543ff50b-2b00-4425-9842-6a02a6b5d3c5","Type":"ContainerDied","Data":"5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4"}
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.687065 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"543ff50b-2b00-4425-9842-6a02a6b5d3c5","Type":"ContainerDied","Data":"608b55785bfb5a2a4c98344ca977bed47a1f368ae13ed16e1ba64bd8bd2fc9cd"}
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.687081 4786 scope.go:117] "RemoveContainer" containerID="5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.687164 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.694513 4786 generic.go:334] "Generic (PLEG): container finished" podID="c8257d11-57cc-46e7-9f55-9c9120621466" containerID="0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31" exitCode=0
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.694573 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.694596 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8257d11-57cc-46e7-9f55-9c9120621466","Type":"ContainerDied","Data":"0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31"}
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.694619 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c8257d11-57cc-46e7-9f55-9c9120621466","Type":"ContainerDied","Data":"f1a2004539ba46242ce43d824259c9ff1488083711e06e347f60ce12a66ac5ab"}
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.712570 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.721173 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.722553 4786 scope.go:117] "RemoveContainer" containerID="4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.729102 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.741174 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.747310 4786 scope.go:117] "RemoveContainer" containerID="5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4"
Oct 02 07:03:50 crc kubenswrapper[4786]: E1002 07:03:50.748091 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4\": container with ID starting with 5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4 not found: ID does not exist" containerID="5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.748120 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4"} err="failed to get container status \"5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4\": rpc error: code = NotFound desc = could not find container \"5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4\": container with ID starting with 5a2578412427231cd0a7254ca8d16d3a4fc8cd61194df2720a46540a928db8d4 not found: ID does not exist"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.748145 4786 scope.go:117] "RemoveContainer" containerID="4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638"
Oct 02 07:03:50 crc kubenswrapper[4786]: E1002 07:03:50.748413 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638\": container with ID starting with 4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638 not found: ID does not exist" containerID="4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.749473 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638"} err="failed to get container status \"4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638\": rpc error: code = NotFound desc = could not find container \"4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638\": container with ID starting with 4cf059fb2a370cf51dcfb3f92f5dc869b3fb3751f2d025e67d6289d19499d638 not found: ID does not exist"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.749490 4786 scope.go:117] "RemoveContainer" containerID="0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.750705 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 07:03:50 crc kubenswrapper[4786]: E1002 07:03:50.750991 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-log"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.751010 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-log"
Oct 02 07:03:50 crc kubenswrapper[4786]: E1002 07:03:50.751023 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-api"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.751029 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-api"
Oct 02 07:03:50 crc kubenswrapper[4786]: E1002 07:03:50.751055 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-metadata"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.751061 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-metadata"
Oct 02 07:03:50 crc kubenswrapper[4786]: E1002 07:03:50.751071 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-log"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.751076 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-log"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.751222 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-api"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.751236 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-log"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.751251 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" containerName="nova-metadata-metadata"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.751267 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" containerName="nova-api-log"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.753353 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.755214 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.755881 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.768698 4786 scope.go:117] "RemoveContainer" containerID="a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.771200 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.772768 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.774769 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2kx\" (UniqueName: \"kubernetes.io/projected/15256076-c838-461c-98d9-4ae4883be465-kube-api-access-jw2kx\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.775350 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.775441 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-config-data\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.775520 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15256076-c838-461c-98d9-4ae4883be465-logs\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.775603 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.776042 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.776207 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.776321 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.779010 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.783088 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.801829 4786 scope.go:117] "RemoveContainer" containerID="0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31"
Oct 02 07:03:50 crc kubenswrapper[4786]: E1002 07:03:50.802660 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31\": container with ID starting with 0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31 not found: ID does not exist" containerID="0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.802713 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31"} err="failed to get container status \"0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31\": rpc error: code = NotFound desc = could not find container \"0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31\": container with ID starting with 0ada40c26e05c48c7ac6f4e15bfe8e9499be2f8841e21c5783d6802c4e3f4b31 not found: ID does not exist"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.802737 4786 scope.go:117] "RemoveContainer" containerID="a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1"
Oct 02 07:03:50 crc kubenswrapper[4786]: E1002 07:03:50.803102 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1\": container with ID starting with a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1 not found: ID does not exist" containerID="a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.803126 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1"} err="failed to get container status \"a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1\": rpc error: code = NotFound desc = could not find container \"a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1\": container with ID starting with a45c40fa91b52ecb738ac0377d5b2ae1d7ec4160e3f9ec586f80a518748d44f1 not found: ID does not exist"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876758 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw2kx\" (UniqueName: \"kubernetes.io/projected/15256076-c838-461c-98d9-4ae4883be465-kube-api-access-jw2kx\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876822 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bshcc\" (UniqueName: \"kubernetes.io/projected/5925ec23-28fa-42cb-9f3b-aa96d2efab12-kube-api-access-bshcc\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876868 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876891 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-config-data\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876915 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876930 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15256076-c838-461c-98d9-4ae4883be465-logs\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876952 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876970 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.876996 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-public-tls-certs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.877016 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5925ec23-28fa-42cb-9f3b-aa96d2efab12-logs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.877031 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-config-data\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.877326 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15256076-c838-461c-98d9-4ae4883be465-logs\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.880710 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.881103 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.881222 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15256076-c838-461c-98d9-4ae4883be465-config-data\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.889843 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw2kx\" (UniqueName: \"kubernetes.io/projected/15256076-c838-461c-98d9-4ae4883be465-kube-api-access-jw2kx\") pod \"nova-metadata-0\" (UID: \"15256076-c838-461c-98d9-4ae4883be465\") " pod="openstack/nova-metadata-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.978274 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.978319 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.978353 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-public-tls-certs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.978377 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5925ec23-28fa-42cb-9f3b-aa96d2efab12-logs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.978393 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-config-data\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.978465 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bshcc\" (UniqueName: \"kubernetes.io/projected/5925ec23-28fa-42cb-9f3b-aa96d2efab12-kube-api-access-bshcc\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.979419 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5925ec23-28fa-42cb-9f3b-aa96d2efab12-logs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.981529 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.982171 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.983067 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-public-tls-certs\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.983392 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5925ec23-28fa-42cb-9f3b-aa96d2efab12-config-data\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:50 crc kubenswrapper[4786]: I1002 07:03:50.998474 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bshcc\" (UniqueName: \"kubernetes.io/projected/5925ec23-28fa-42cb-9f3b-aa96d2efab12-kube-api-access-bshcc\") pod \"nova-api-0\" (UID: \"5925ec23-28fa-42cb-9f3b-aa96d2efab12\") " pod="openstack/nova-api-0"
Oct 02 07:03:51 crc kubenswrapper[4786]: I1002 07:03:51.070181 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 07:03:51 crc kubenswrapper[4786]: I1002 07:03:51.088117 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 07:03:51 crc kubenswrapper[4786]: I1002 07:03:51.458632 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 07:03:51 crc kubenswrapper[4786]: I1002 07:03:51.464284 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 07:03:51 crc kubenswrapper[4786]: I1002 07:03:51.704329 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15256076-c838-461c-98d9-4ae4883be465","Type":"ContainerStarted","Data":"060a6db4445b0428e79afdad44ad50fbeb6d8e1f11ca5d0ff5b2062eab378a6c"}
Oct 02 07:03:51 crc kubenswrapper[4786]: I1002 07:03:51.704512 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15256076-c838-461c-98d9-4ae4883be465","Type":"ContainerStarted","Data":"5c15853b6860f7b3037958a4518fb4f746287b64a9d5ae4a105ab969b29c707d"}
Oct 02 07:03:51 crc kubenswrapper[4786]: I1002 07:03:51.705545 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5925ec23-28fa-42cb-9f3b-aa96d2efab12","Type":"ContainerStarted","Data":"df1b7dfbb214dfe88d7e676bbeee641d3dacd610c9fda2965a2909bba63e2a80"}
Oct 02 07:03:51 crc kubenswrapper[4786]: I1002 07:03:51.705578 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5925ec23-28fa-42cb-9f3b-aa96d2efab12","Type":"ContainerStarted","Data":"7697fb282386725dca694e29aef293906c5f28b8e77f25e4ca6d3679a50423e9"}
Oct 02 07:03:52 crc kubenswrapper[4786]: I1002 07:03:52.187023 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543ff50b-2b00-4425-9842-6a02a6b5d3c5" path="/var/lib/kubelet/pods/543ff50b-2b00-4425-9842-6a02a6b5d3c5/volumes"
Oct 02 07:03:52 crc kubenswrapper[4786]: I1002 07:03:52.187625 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8257d11-57cc-46e7-9f55-9c9120621466" path="/var/lib/kubelet/pods/c8257d11-57cc-46e7-9f55-9c9120621466/volumes"
Oct 02 07:03:52 crc kubenswrapper[4786]: E1002 07:03:52.548506 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e85150e_5e6c_4676_8c0e_ed72db605bc0.slice\": RecentStats: unable to find data in memory cache]"
Oct 02 07:03:52 crc kubenswrapper[4786]: I1002 07:03:52.715177 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15256076-c838-461c-98d9-4ae4883be465","Type":"ContainerStarted","Data":"8c28f079f46d34a2682621f672409cf08cc8e9702b11be956546cc8ffd859f1f"}
Oct 02 07:03:52 crc kubenswrapper[4786]: I1002 07:03:52.716815 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5925ec23-28fa-42cb-9f3b-aa96d2efab12","Type":"ContainerStarted","Data":"ea2efcec770132b7ce61b7888e15dcfd84decd7592f11950f73f8c7ef03348d5"}
Oct 02 07:03:52 crc kubenswrapper[4786]: I1002 07:03:52.730069 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.730056282 podStartE2EDuration="2.730056282s" podCreationTimestamp="2025-10-02 07:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:52.726460092 +0000 UTC m=+1042.847643223" watchObservedRunningTime="2025-10-02 07:03:52.730056282 +0000 UTC m=+1042.851239413"
Oct 02 07:03:52 crc kubenswrapper[4786]: I1002 07:03:52.742005 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.74199429 podStartE2EDuration="2.74199429s" podCreationTimestamp="2025-10-02 07:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:03:52.740388687 +0000 UTC m=+1042.861571838" watchObservedRunningTime="2025-10-02 07:03:52.74199429 +0000 UTC m=+1042.863177421"
Oct 02 07:03:54 crc kubenswrapper[4786]: I1002 07:03:54.013816 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 02 07:03:56 crc kubenswrapper[4786]: I1002 07:03:56.071661 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 07:03:56 crc kubenswrapper[4786]: I1002 07:03:56.071944 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 07:03:59 crc kubenswrapper[4786]: I1002 07:03:59.014465 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 02 07:03:59 crc kubenswrapper[4786]: I1002 07:03:59.034407 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 02 07:03:59 crc kubenswrapper[4786]: I1002 07:03:59.153026 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 02 07:03:59 crc kubenswrapper[4786]: I1002 07:03:59.786471 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 02 07:04:01 crc kubenswrapper[4786]: I1002 07:04:01.071552 4786 kubelet.go:2542] "SyncLoop (probe)"
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 07:04:01 crc kubenswrapper[4786]: I1002 07:04:01.071786 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 07:04:01 crc kubenswrapper[4786]: I1002 07:04:01.088895 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 07:04:01 crc kubenswrapper[4786]: I1002 07:04:01.088936 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 07:04:02 crc kubenswrapper[4786]: I1002 07:04:02.083788 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="15256076-c838-461c-98d9-4ae4883be465" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 07:04:02 crc kubenswrapper[4786]: I1002 07:04:02.083809 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="15256076-c838-461c-98d9-4ae4883be465" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 07:04:02 crc kubenswrapper[4786]: I1002 07:04:02.098795 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5925ec23-28fa-42cb-9f3b-aa96d2efab12" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 07:04:02 crc kubenswrapper[4786]: I1002 07:04:02.098827 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5925ec23-28fa-42cb-9f3b-aa96d2efab12" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Oct 02 07:04:02 crc kubenswrapper[4786]: E1002 07:04:02.741143 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e85150e_5e6c_4676_8c0e_ed72db605bc0.slice\": RecentStats: unable to find data in memory cache]" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.076714 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.077055 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.080477 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.080907 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.094642 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.094940 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.095968 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.099298 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.842758 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 07:04:11 crc kubenswrapper[4786]: I1002 07:04:11.847160 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Oct 02 07:04:12 crc kubenswrapper[4786]: E1002 07:04:12.914937 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e85150e_5e6c_4676_8c0e_ed72db605bc0.slice\": RecentStats: unable to find data in memory cache]" Oct 02 07:04:17 crc kubenswrapper[4786]: I1002 07:04:17.949868 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 07:04:18 crc kubenswrapper[4786]: I1002 07:04:18.587166 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 07:04:21 crc kubenswrapper[4786]: I1002 07:04:21.079808 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerName="rabbitmq" containerID="cri-o://270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4" gracePeriod=604797 Oct 02 07:04:21 crc kubenswrapper[4786]: I1002 07:04:21.689369 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerName="rabbitmq" containerID="cri-o://d11020b391664126e8846d6b64d18e6efa22a39eedf2a85372e338a8359e5fbf" gracePeriod=604797 Oct 02 07:04:23 crc kubenswrapper[4786]: E1002 07:04:23.103707 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e85150e_5e6c_4676_8c0e_ed72db605bc0.slice\": RecentStats: unable to find data in memory cache]" Oct 02 07:04:26 crc kubenswrapper[4786]: I1002 07:04:26.276534 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerName="rabbitmq" probeResult="failure" output="dial tcp 
10.217.0.96:5671: connect: connection refused" Oct 02 07:04:26 crc kubenswrapper[4786]: I1002 07:04:26.516931 4786 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.97:5671: connect: connection refused" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.444708 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549247 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a34253b-beed-468f-8bad-82366a5eb5c3-pod-info\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549280 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-server-conf\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549310 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99rrf\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-kube-api-access-99rrf\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549353 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-config-data\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 
07:04:27.549398 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-erlang-cookie\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549528 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a34253b-beed-468f-8bad-82366a5eb5c3-erlang-cookie-secret\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549552 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-plugins-conf\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549614 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549635 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-plugins\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549652 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-confd\") pod 
\"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.549669 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-tls\") pod \"7a34253b-beed-468f-8bad-82366a5eb5c3\" (UID: \"7a34253b-beed-468f-8bad-82366a5eb5c3\") " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.550342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.550412 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.550490 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.554147 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a34253b-beed-468f-8bad-82366a5eb5c3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.554512 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7a34253b-beed-468f-8bad-82366a5eb5c3-pod-info" (OuterVolumeSpecName: "pod-info") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.554661 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.554715 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-kube-api-access-99rrf" (OuterVolumeSpecName: "kube-api-access-99rrf") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "kube-api-access-99rrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.568834 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.568851 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-config-data" (OuterVolumeSpecName: "config-data") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.584022 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-server-conf" (OuterVolumeSpecName: "server-conf") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.621720 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7a34253b-beed-468f-8bad-82366a5eb5c3" (UID: "7a34253b-beed-468f-8bad-82366a5eb5c3"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651782 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651809 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651819 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651827 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651835 4786 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7a34253b-beed-468f-8bad-82366a5eb5c3-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651842 4786 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651849 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99rrf\" (UniqueName: \"kubernetes.io/projected/7a34253b-beed-468f-8bad-82366a5eb5c3-kube-api-access-99rrf\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651857 4786 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651864 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7a34253b-beed-468f-8bad-82366a5eb5c3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651872 4786 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7a34253b-beed-468f-8bad-82366a5eb5c3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.651880 4786 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7a34253b-beed-468f-8bad-82366a5eb5c3-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.668234 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.752871 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.942607 4786 generic.go:334] "Generic (PLEG): container finished" podID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerID="d11020b391664126e8846d6b64d18e6efa22a39eedf2a85372e338a8359e5fbf" exitCode=0 Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.942872 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94","Type":"ContainerDied","Data":"d11020b391664126e8846d6b64d18e6efa22a39eedf2a85372e338a8359e5fbf"} Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.944296 4786 generic.go:334] "Generic (PLEG): container finished" podID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerID="270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4" exitCode=0 Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.944334 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7a34253b-beed-468f-8bad-82366a5eb5c3","Type":"ContainerDied","Data":"270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4"} Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.944357 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7a34253b-beed-468f-8bad-82366a5eb5c3","Type":"ContainerDied","Data":"cd330b6f989a2bff319e87567b1f4b2c6b8b8a75685ceb62a02de19845376123"} Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.944375 4786 scope.go:117] "RemoveContainer" containerID="270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.944476 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.965994 4786 scope.go:117] "RemoveContainer" containerID="38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.973069 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.993428 4786 scope.go:117] "RemoveContainer" containerID="270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4" Oct 02 07:04:27 crc kubenswrapper[4786]: E1002 07:04:27.993932 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4\": container with ID starting with 270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4 not found: ID does not exist" containerID="270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.993980 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4"} err="failed to get container status \"270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4\": rpc error: code = NotFound desc = could not find container \"270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4\": container with ID starting with 270f654eca7e4dcacf7de9e7f064eb2b0631a12635565ed6f9453c98f853f3f4 not found: ID does not exist" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.994007 4786 scope.go:117] "RemoveContainer" containerID="38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.994167 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 07:04:27 crc 
kubenswrapper[4786]: E1002 07:04:27.994444 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190\": container with ID starting with 38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190 not found: ID does not exist" containerID="38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190" Oct 02 07:04:27 crc kubenswrapper[4786]: I1002 07:04:27.994488 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190"} err="failed to get container status \"38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190\": rpc error: code = NotFound desc = could not find container \"38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190\": container with ID starting with 38e87a1f59faf04a43c609aa662d827fd81aad0a9d27e60d93ad5dfece74d190 not found: ID does not exist" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.007754 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 07:04:28 crc kubenswrapper[4786]: E1002 07:04:28.008286 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerName="rabbitmq" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.008368 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerName="rabbitmq" Oct 02 07:04:28 crc kubenswrapper[4786]: E1002 07:04:28.008433 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerName="setup-container" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.008481 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerName="setup-container" Oct 02 07:04:28 crc 
kubenswrapper[4786]: I1002 07:04:28.008747 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a34253b-beed-468f-8bad-82366a5eb5c3" containerName="rabbitmq" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.009931 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.012389 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.012733 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.012732 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.012790 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vbf4k" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.012853 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.012798 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.012995 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.013830 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.020842 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.056456 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-plugins-conf\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.056644 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-confd\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.056786 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g4zj\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-kube-api-access-4g4zj\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.056859 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-config-data\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.056968 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-erlang-cookie-secret\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.057086 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-tls\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.057153 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-server-conf\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.057251 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-erlang-cookie\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.057379 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-plugins\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.057462 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.057535 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-pod-info\") pod \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\" (UID: \"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94\") " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 
07:04:28.056902 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.057909 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.057976 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.058079 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.058188 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/581ccdf3-7a38-4bb7-93ac-035207098fb7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.058281 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.058417 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/581ccdf3-7a38-4bb7-93ac-035207098fb7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.058529 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 
crc kubenswrapper[4786]: I1002 07:04:28.058728 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.058817 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-config-data\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.059061 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.059175 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpxgd\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-kube-api-access-qpxgd\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.059237 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.059494 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.059606 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.059729 4786 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.059783 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.060276 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.060638 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-pod-info" (OuterVolumeSpecName: "pod-info") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.063360 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.065895 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.065999 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-kube-api-access-4g4zj" (OuterVolumeSpecName: "kube-api-access-4g4zj") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "kube-api-access-4g4zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.094121 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-config-data" (OuterVolumeSpecName: "config-data") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.107037 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-server-conf" (OuterVolumeSpecName: "server-conf") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.133794 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" (UID: "b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.161756 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpxgd\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-kube-api-access-qpxgd\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.161888 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.161968 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162064 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162152 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/581ccdf3-7a38-4bb7-93ac-035207098fb7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162242 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162324 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/581ccdf3-7a38-4bb7-93ac-035207098fb7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162435 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162534 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162627 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-config-data\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162730 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162859 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162928 4786 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.162990 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163045 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g4zj\" (UniqueName: 
\"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-kube-api-access-4g4zj\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163094 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163142 4786 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163199 4786 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163262 4786 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163236 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163419 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 
07:04:28.163637 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-config-data\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163868 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163137 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.163427 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/581ccdf3-7a38-4bb7-93ac-035207098fb7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.165514 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/581ccdf3-7a38-4bb7-93ac-035207098fb7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.165533 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/581ccdf3-7a38-4bb7-93ac-035207098fb7-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.165541 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.166947 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.175609 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpxgd\" (UniqueName: \"kubernetes.io/projected/581ccdf3-7a38-4bb7-93ac-035207098fb7-kube-api-access-qpxgd\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.180891 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.188828 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a34253b-beed-468f-8bad-82366a5eb5c3" path="/var/lib/kubelet/pods/7a34253b-beed-468f-8bad-82366a5eb5c3/volumes" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.198773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"581ccdf3-7a38-4bb7-93ac-035207098fb7\") " 
pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.264786 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.341126 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.426724 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76c6ddb9ff-88b27"] Oct 02 07:04:28 crc kubenswrapper[4786]: E1002 07:04:28.427221 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerName="rabbitmq" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.427237 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerName="rabbitmq" Oct 02 07:04:28 crc kubenswrapper[4786]: E1002 07:04:28.427261 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerName="setup-container" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.427267 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerName="setup-container" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.427419 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" containerName="rabbitmq" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.430066 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.435254 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.442531 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c6ddb9ff-88b27"] Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.468826 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-config\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.468864 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-sb\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.468933 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4b8l\" (UniqueName: \"kubernetes.io/projected/c3f19a04-1d20-417d-aa55-dbc2bf374d39-kube-api-access-r4b8l\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.468958 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-svc\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " 
pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.468986 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-swift-storage-0\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.469017 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-openstack-edpm-ipam\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.469039 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-nb\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.570579 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4b8l\" (UniqueName: \"kubernetes.io/projected/c3f19a04-1d20-417d-aa55-dbc2bf374d39-kube-api-access-r4b8l\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.570919 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-svc\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: 
\"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.571581 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-svc\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.571654 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-swift-storage-0\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.572182 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-swift-storage-0\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.572302 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-openstack-edpm-ipam\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.572893 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-openstack-edpm-ipam\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " 
pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.572921 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-nb\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.572966 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-config\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.573177 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-nb\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.573480 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-config\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.573531 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-sb\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 
07:04:28.574178 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-sb\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.592407 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4b8l\" (UniqueName: \"kubernetes.io/projected/c3f19a04-1d20-417d-aa55-dbc2bf374d39-kube-api-access-r4b8l\") pod \"dnsmasq-dns-76c6ddb9ff-88b27\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.735790 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.752926 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.952790 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"581ccdf3-7a38-4bb7-93ac-035207098fb7","Type":"ContainerStarted","Data":"4b5e3872336f27737fdb26b7edff465f49761bd96529f14f5781910d2815f7fa"} Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.955604 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94","Type":"ContainerDied","Data":"541088e2f41b0e4885c54ba993463851753f6e92f27130e5f9d6c41d6d95886a"} Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.955649 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.955659 4786 scope.go:117] "RemoveContainer" containerID="d11020b391664126e8846d6b64d18e6efa22a39eedf2a85372e338a8359e5fbf" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.985452 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.988603 4786 scope.go:117] "RemoveContainer" containerID="aeadaf247e974c8121b9c913260efe601073c3a9a7d2a40863d88fcb35011d88" Oct 02 07:04:28 crc kubenswrapper[4786]: I1002 07:04:28.990851 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.002777 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.004147 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.006006 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.006339 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.006366 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.006529 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4p8pv" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.006739 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.006866 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.007600 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.015543 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083137 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00373ed3-3b08-4040-9e05-fd042f541af6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083174 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083191 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083240 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00373ed3-3b08-4040-9e05-fd042f541af6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083270 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083316 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083331 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083370 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zs8f\" (UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-kube-api-access-5zs8f\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083390 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083489 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.083520 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.185422 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zs8f\" 
(UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-kube-api-access-5zs8f\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.185643 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.185809 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.185852 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.185893 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00373ed3-3b08-4040-9e05-fd042f541af6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.185927 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-confd\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.185940 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.186004 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00373ed3-3b08-4040-9e05-fd042f541af6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.186031 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.186097 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.186115 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 
07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.186604 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.186868 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.187090 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.187249 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.187794 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.187927 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00373ed3-3b08-4040-9e05-fd042f541af6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.192288 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.192337 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00373ed3-3b08-4040-9e05-fd042f541af6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.192653 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.193590 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00373ed3-3b08-4040-9e05-fd042f541af6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.200751 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c6ddb9ff-88b27"] Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.204458 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5zs8f\" (UniqueName: \"kubernetes.io/projected/00373ed3-3b08-4040-9e05-fd042f541af6-kube-api-access-5zs8f\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.210400 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"00373ed3-3b08-4040-9e05-fd042f541af6\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.321563 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.739752 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 07:04:29 crc kubenswrapper[4786]: W1002 07:04:29.746779 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00373ed3_3b08_4040_9e05_fd042f541af6.slice/crio-e546acd8358be773afb8c9456cfbfba289f8134c080fb55774e2c044dc495e78 WatchSource:0}: Error finding container e546acd8358be773afb8c9456cfbfba289f8134c080fb55774e2c044dc495e78: Status 404 returned error can't find the container with id e546acd8358be773afb8c9456cfbfba289f8134c080fb55774e2c044dc495e78 Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.964368 4786 generic.go:334] "Generic (PLEG): container finished" podID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" containerID="1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547" exitCode=0 Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.964433 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" 
event={"ID":"c3f19a04-1d20-417d-aa55-dbc2bf374d39","Type":"ContainerDied","Data":"1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547"} Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.964621 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" event={"ID":"c3f19a04-1d20-417d-aa55-dbc2bf374d39","Type":"ContainerStarted","Data":"a39055bc6e98afb3923258d3c36fd1566891ec54b0d1b6c2a6d03f2de5f74eb9"} Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.965751 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00373ed3-3b08-4040-9e05-fd042f541af6","Type":"ContainerStarted","Data":"e546acd8358be773afb8c9456cfbfba289f8134c080fb55774e2c044dc495e78"} Oct 02 07:04:29 crc kubenswrapper[4786]: I1002 07:04:29.967529 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"581ccdf3-7a38-4bb7-93ac-035207098fb7","Type":"ContainerStarted","Data":"ba974aac15463b55d20f92e85860fb962ba14a1d0545ed24a189d6eecd57ef3e"} Oct 02 07:04:30 crc kubenswrapper[4786]: I1002 07:04:30.187851 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94" path="/var/lib/kubelet/pods/b8d55d69-26a7-4b9f-9b6f-d63b0dabfb94/volumes" Oct 02 07:04:30 crc kubenswrapper[4786]: I1002 07:04:30.975339 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" event={"ID":"c3f19a04-1d20-417d-aa55-dbc2bf374d39","Type":"ContainerStarted","Data":"1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4"} Oct 02 07:04:30 crc kubenswrapper[4786]: I1002 07:04:30.975521 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:30 crc kubenswrapper[4786]: I1002 07:04:30.977158 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"00373ed3-3b08-4040-9e05-fd042f541af6","Type":"ContainerStarted","Data":"daace577174f9f0c7ab0d422ef942399323c612416489aecd53c5596609c03e3"} Oct 02 07:04:30 crc kubenswrapper[4786]: I1002 07:04:30.993896 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" podStartSLOduration=2.9938819519999997 podStartE2EDuration="2.993881952s" podCreationTimestamp="2025-10-02 07:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:04:30.988263343 +0000 UTC m=+1081.109446484" watchObservedRunningTime="2025-10-02 07:04:30.993881952 +0000 UTC m=+1081.115065082" Oct 02 07:04:38 crc kubenswrapper[4786]: I1002 07:04:38.754852 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:38 crc kubenswrapper[4786]: I1002 07:04:38.792906 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-558cfbd59c-vlkt9"] Oct 02 07:04:38 crc kubenswrapper[4786]: I1002 07:04:38.876266 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fdcbb4567-8fktd"] Oct 02 07:04:38 crc kubenswrapper[4786]: I1002 07:04:38.877544 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:38 crc kubenswrapper[4786]: I1002 07:04:38.890800 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdcbb4567-8fktd"] Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.026241 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" podUID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" containerName="dnsmasq-dns" containerID="cri-o://1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2" gracePeriod=10 Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.032029 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-dns-svc\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.032087 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-ovsdbserver-sb\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.032282 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-openstack-edpm-ipam\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.032363 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-config\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.032423 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-ovsdbserver-nb\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.032521 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-dns-swift-storage-0\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.032586 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsd42\" (UniqueName: \"kubernetes.io/projected/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-kube-api-access-wsd42\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.134839 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-openstack-edpm-ipam\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.134921 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-config\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.134974 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-ovsdbserver-nb\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.135039 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-dns-swift-storage-0\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.135103 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsd42\" (UniqueName: \"kubernetes.io/projected/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-kube-api-access-wsd42\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.135126 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-dns-svc\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.135150 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-ovsdbserver-sb\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.135647 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-openstack-edpm-ipam\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.135739 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-config\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.135992 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-dns-svc\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.136004 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-ovsdbserver-nb\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.136521 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-dns-swift-storage-0\") pod 
\"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.136704 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-ovsdbserver-sb\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.150528 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsd42\" (UniqueName: \"kubernetes.io/projected/1fc331cd-16f8-41c1-8a54-7259f7c5fecb-kube-api-access-wsd42\") pod \"dnsmasq-dns-fdcbb4567-8fktd\" (UID: \"1fc331cd-16f8-41c1-8a54-7259f7c5fecb\") " pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.198035 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.362654 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.440133 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-nb\") pod \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.440341 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbqvl\" (UniqueName: \"kubernetes.io/projected/aa599fa1-31c3-4f98-a942-c67de6fe96e7-kube-api-access-kbqvl\") pod \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.440386 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-config\") pod \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.440454 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-swift-storage-0\") pod \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.440476 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-svc\") pod \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.440502 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-sb\") pod \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\" (UID: \"aa599fa1-31c3-4f98-a942-c67de6fe96e7\") " Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.446134 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa599fa1-31c3-4f98-a942-c67de6fe96e7-kube-api-access-kbqvl" (OuterVolumeSpecName: "kube-api-access-kbqvl") pod "aa599fa1-31c3-4f98-a942-c67de6fe96e7" (UID: "aa599fa1-31c3-4f98-a942-c67de6fe96e7"). InnerVolumeSpecName "kube-api-access-kbqvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.474074 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa599fa1-31c3-4f98-a942-c67de6fe96e7" (UID: "aa599fa1-31c3-4f98-a942-c67de6fe96e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.475725 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa599fa1-31c3-4f98-a942-c67de6fe96e7" (UID: "aa599fa1-31c3-4f98-a942-c67de6fe96e7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.476438 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-config" (OuterVolumeSpecName: "config") pod "aa599fa1-31c3-4f98-a942-c67de6fe96e7" (UID: "aa599fa1-31c3-4f98-a942-c67de6fe96e7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.476584 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa599fa1-31c3-4f98-a942-c67de6fe96e7" (UID: "aa599fa1-31c3-4f98-a942-c67de6fe96e7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.477133 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa599fa1-31c3-4f98-a942-c67de6fe96e7" (UID: "aa599fa1-31c3-4f98-a942-c67de6fe96e7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.542337 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.542366 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbqvl\" (UniqueName: \"kubernetes.io/projected/aa599fa1-31c3-4f98-a942-c67de6fe96e7-kube-api-access-kbqvl\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.542377 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.542385 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.542393 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.542400 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa599fa1-31c3-4f98-a942-c67de6fe96e7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:39 crc kubenswrapper[4786]: I1002 07:04:39.574381 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fdcbb4567-8fktd"] Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.034414 4786 generic.go:334] "Generic (PLEG): container finished" podID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" containerID="1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2" exitCode=0 Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.034464 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.034470 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" event={"ID":"aa599fa1-31c3-4f98-a942-c67de6fe96e7","Type":"ContainerDied","Data":"1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2"} Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.034508 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-558cfbd59c-vlkt9" event={"ID":"aa599fa1-31c3-4f98-a942-c67de6fe96e7","Type":"ContainerDied","Data":"d3dcafcde0239eb6835d9c97b45697b8f7a450edbacdbcee5e4b6f16e9c387d2"} Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.034525 4786 scope.go:117] "RemoveContainer" containerID="1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2" Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.036138 4786 generic.go:334] "Generic (PLEG): container finished" podID="1fc331cd-16f8-41c1-8a54-7259f7c5fecb" containerID="9bf8c139711651ead6ae60a8dc74728dbc30b46fb61f2e9994e74b13e54438ff" exitCode=0 Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.036165 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" event={"ID":"1fc331cd-16f8-41c1-8a54-7259f7c5fecb","Type":"ContainerDied","Data":"9bf8c139711651ead6ae60a8dc74728dbc30b46fb61f2e9994e74b13e54438ff"} Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.036185 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" event={"ID":"1fc331cd-16f8-41c1-8a54-7259f7c5fecb","Type":"ContainerStarted","Data":"d7146884918d9fb4e06270b92f80a30be249b7a00318d7c2c669639c295b114b"} Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.051718 4786 scope.go:117] "RemoveContainer" containerID="0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b" Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.076540 
4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-558cfbd59c-vlkt9"] Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.082179 4786 scope.go:117] "RemoveContainer" containerID="1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2" Oct 02 07:04:40 crc kubenswrapper[4786]: E1002 07:04:40.082507 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2\": container with ID starting with 1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2 not found: ID does not exist" containerID="1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2" Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.082550 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2"} err="failed to get container status \"1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2\": rpc error: code = NotFound desc = could not find container \"1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2\": container with ID starting with 1382e223f76d17ea21d4d68197de2473f46c2f0a384338d0f71e7972658b1cd2 not found: ID does not exist" Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.082572 4786 scope.go:117] "RemoveContainer" containerID="0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b" Oct 02 07:04:40 crc kubenswrapper[4786]: E1002 07:04:40.082900 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b\": container with ID starting with 0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b not found: ID does not exist" containerID="0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b" Oct 02 07:04:40 
crc kubenswrapper[4786]: I1002 07:04:40.082937 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b"} err="failed to get container status \"0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b\": rpc error: code = NotFound desc = could not find container \"0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b\": container with ID starting with 0f55e384f31c0f46a84ed92ed94e14f6bbf902a583e7203a0abd5cf275c6f36b not found: ID does not exist" Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.082997 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-558cfbd59c-vlkt9"] Oct 02 07:04:40 crc kubenswrapper[4786]: I1002 07:04:40.186988 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" path="/var/lib/kubelet/pods/aa599fa1-31c3-4f98-a942-c67de6fe96e7/volumes" Oct 02 07:04:41 crc kubenswrapper[4786]: I1002 07:04:41.044606 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" event={"ID":"1fc331cd-16f8-41c1-8a54-7259f7c5fecb","Type":"ContainerStarted","Data":"3f79ef8233ae65834e548b258e5029d8d3957602c6d4afd95353f1f1f93d92a5"} Oct 02 07:04:41 crc kubenswrapper[4786]: I1002 07:04:41.044800 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:41 crc kubenswrapper[4786]: I1002 07:04:41.064443 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" podStartSLOduration=3.064431442 podStartE2EDuration="3.064431442s" podCreationTimestamp="2025-10-02 07:04:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:04:41.058279628 +0000 UTC m=+1091.179462769" 
watchObservedRunningTime="2025-10-02 07:04:41.064431442 +0000 UTC m=+1091.185614573" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.199996 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fdcbb4567-8fktd" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.239082 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c6ddb9ff-88b27"] Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.239275 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" podUID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" containerName="dnsmasq-dns" containerID="cri-o://1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4" gracePeriod=10 Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.593088 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.789489 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4b8l\" (UniqueName: \"kubernetes.io/projected/c3f19a04-1d20-417d-aa55-dbc2bf374d39-kube-api-access-r4b8l\") pod \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.789561 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-openstack-edpm-ipam\") pod \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.789592 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-swift-storage-0\") pod 
\"c3f19a04-1d20-417d-aa55-dbc2bf374d39\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.789622 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-nb\") pod \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.789818 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-config\") pod \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.789838 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-sb\") pod \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.789863 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-svc\") pod \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\" (UID: \"c3f19a04-1d20-417d-aa55-dbc2bf374d39\") " Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.795129 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3f19a04-1d20-417d-aa55-dbc2bf374d39-kube-api-access-r4b8l" (OuterVolumeSpecName: "kube-api-access-r4b8l") pod "c3f19a04-1d20-417d-aa55-dbc2bf374d39" (UID: "c3f19a04-1d20-417d-aa55-dbc2bf374d39"). InnerVolumeSpecName "kube-api-access-r4b8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.827142 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3f19a04-1d20-417d-aa55-dbc2bf374d39" (UID: "c3f19a04-1d20-417d-aa55-dbc2bf374d39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.827175 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3f19a04-1d20-417d-aa55-dbc2bf374d39" (UID: "c3f19a04-1d20-417d-aa55-dbc2bf374d39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.828178 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c3f19a04-1d20-417d-aa55-dbc2bf374d39" (UID: "c3f19a04-1d20-417d-aa55-dbc2bf374d39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.830715 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3f19a04-1d20-417d-aa55-dbc2bf374d39" (UID: "c3f19a04-1d20-417d-aa55-dbc2bf374d39"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.831698 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-config" (OuterVolumeSpecName: "config") pod "c3f19a04-1d20-417d-aa55-dbc2bf374d39" (UID: "c3f19a04-1d20-417d-aa55-dbc2bf374d39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.832180 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c3f19a04-1d20-417d-aa55-dbc2bf374d39" (UID: "c3f19a04-1d20-417d-aa55-dbc2bf374d39"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.891194 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.891221 4786 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.891230 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.891238 4786 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-config\") on node \"crc\" 
DevicePath \"\"" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.891246 4786 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.891255 4786 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3f19a04-1d20-417d-aa55-dbc2bf374d39-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:49 crc kubenswrapper[4786]: I1002 07:04:49.891262 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4b8l\" (UniqueName: \"kubernetes.io/projected/c3f19a04-1d20-417d-aa55-dbc2bf374d39-kube-api-access-r4b8l\") on node \"crc\" DevicePath \"\"" Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.100856 4786 generic.go:334] "Generic (PLEG): container finished" podID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" containerID="1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4" exitCode=0 Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.100893 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" event={"ID":"c3f19a04-1d20-417d-aa55-dbc2bf374d39","Type":"ContainerDied","Data":"1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4"} Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.100907 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.100926 4786 scope.go:117] "RemoveContainer" containerID="1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4" Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.100916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c6ddb9ff-88b27" event={"ID":"c3f19a04-1d20-417d-aa55-dbc2bf374d39","Type":"ContainerDied","Data":"a39055bc6e98afb3923258d3c36fd1566891ec54b0d1b6c2a6d03f2de5f74eb9"} Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.116928 4786 scope.go:117] "RemoveContainer" containerID="1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547" Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.123971 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c6ddb9ff-88b27"] Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.130173 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76c6ddb9ff-88b27"] Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.132849 4786 scope.go:117] "RemoveContainer" containerID="1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4" Oct 02 07:04:50 crc kubenswrapper[4786]: E1002 07:04:50.133160 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4\": container with ID starting with 1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4 not found: ID does not exist" containerID="1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4" Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.133197 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4"} err="failed to get container status 
\"1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4\": rpc error: code = NotFound desc = could not find container \"1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4\": container with ID starting with 1055324ad376191b7bc5b7e6939caf90e6b27c20cd03bd50f8e09373dc84edd4 not found: ID does not exist" Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.133223 4786 scope.go:117] "RemoveContainer" containerID="1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547" Oct 02 07:04:50 crc kubenswrapper[4786]: E1002 07:04:50.133511 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547\": container with ID starting with 1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547 not found: ID does not exist" containerID="1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547" Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.133544 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547"} err="failed to get container status \"1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547\": rpc error: code = NotFound desc = could not find container \"1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547\": container with ID starting with 1747c6564d3607cc32a3d7fff56e61534c13270b863f7c85f83e508a88586547 not found: ID does not exist" Oct 02 07:04:50 crc kubenswrapper[4786]: I1002 07:04:50.187194 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" path="/var/lib/kubelet/pods/c3f19a04-1d20-417d-aa55-dbc2bf374d39/volumes" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.497301 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.497762 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.955165 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k"] Oct 02 07:04:57 crc kubenswrapper[4786]: E1002 07:04:57.955706 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" containerName="init" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.955803 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" containerName="init" Oct 02 07:04:57 crc kubenswrapper[4786]: E1002 07:04:57.955884 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" containerName="dnsmasq-dns" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.955939 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" containerName="dnsmasq-dns" Oct 02 07:04:57 crc kubenswrapper[4786]: E1002 07:04:57.955989 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" containerName="init" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.956032 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" containerName="init" Oct 02 07:04:57 crc kubenswrapper[4786]: E1002 07:04:57.956083 4786 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" containerName="dnsmasq-dns" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.959764 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" containerName="dnsmasq-dns" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.960102 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa599fa1-31c3-4f98-a942-c67de6fe96e7" containerName="dnsmasq-dns" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.960135 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3f19a04-1d20-417d-aa55-dbc2bf374d39" containerName="dnsmasq-dns" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.960735 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.962717 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.962840 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.962859 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.963024 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k"] Oct 02 07:04:57 crc kubenswrapper[4786]: I1002 07:04:57.963245 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.000949 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q6nvn\" (UniqueName: \"kubernetes.io/projected/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-kube-api-access-q6nvn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.001041 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.001088 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.001115 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.102362 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6nvn\" (UniqueName: \"kubernetes.io/projected/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-kube-api-access-q6nvn\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.102436 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.102482 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.102511 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.106934 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.107333 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.107960 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.115229 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nvn\" (UniqueName: \"kubernetes.io/projected/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-kube-api-access-q6nvn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c879k\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.274397 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:04:58 crc kubenswrapper[4786]: I1002 07:04:58.677444 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k"] Oct 02 07:04:59 crc kubenswrapper[4786]: I1002 07:04:59.158183 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" event={"ID":"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11","Type":"ContainerStarted","Data":"f9931a1e2be3dc04bc7347fe99cbceab25fed74f246545b04f531ebfa833fedb"} Oct 02 07:05:01 crc kubenswrapper[4786]: I1002 07:05:01.176530 4786 generic.go:334] "Generic (PLEG): container finished" podID="581ccdf3-7a38-4bb7-93ac-035207098fb7" containerID="ba974aac15463b55d20f92e85860fb962ba14a1d0545ed24a189d6eecd57ef3e" exitCode=0 Oct 02 07:05:01 crc kubenswrapper[4786]: I1002 07:05:01.176602 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"581ccdf3-7a38-4bb7-93ac-035207098fb7","Type":"ContainerDied","Data":"ba974aac15463b55d20f92e85860fb962ba14a1d0545ed24a189d6eecd57ef3e"} Oct 02 07:05:02 crc kubenswrapper[4786]: I1002 07:05:02.184375 4786 generic.go:334] "Generic (PLEG): container finished" podID="00373ed3-3b08-4040-9e05-fd042f541af6" containerID="daace577174f9f0c7ab0d422ef942399323c612416489aecd53c5596609c03e3" exitCode=0 Oct 02 07:05:02 crc kubenswrapper[4786]: I1002 07:05:02.188230 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00373ed3-3b08-4040-9e05-fd042f541af6","Type":"ContainerDied","Data":"daace577174f9f0c7ab0d422ef942399323c612416489aecd53c5596609c03e3"} Oct 02 07:05:06 crc kubenswrapper[4786]: I1002 07:05:06.211637 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"581ccdf3-7a38-4bb7-93ac-035207098fb7","Type":"ContainerStarted","Data":"0a124a49b844756713d4519dd9f692ba03c7927a52d47516b3fc1d76d17418e0"} Oct 02 07:05:06 crc kubenswrapper[4786]: I1002 07:05:06.212113 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 07:05:06 crc kubenswrapper[4786]: I1002 07:05:06.213199 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" event={"ID":"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11","Type":"ContainerStarted","Data":"400f2ca41d48b5254f8df9f64dd8d42200ec204fe63e161c52f46fa5100bd5ff"} Oct 02 07:05:06 crc kubenswrapper[4786]: I1002 07:05:06.214713 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"00373ed3-3b08-4040-9e05-fd042f541af6","Type":"ContainerStarted","Data":"014248134eed5f6a1f68c2546795f667df983975b10760b6e957710d5cf07f09"} Oct 02 07:05:06 crc kubenswrapper[4786]: I1002 07:05:06.215125 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:05:06 crc kubenswrapper[4786]: I1002 07:05:06.243717 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.243683981 podStartE2EDuration="39.243683981s" podCreationTimestamp="2025-10-02 07:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:05:06.226882536 +0000 UTC m=+1116.348065677" watchObservedRunningTime="2025-10-02 07:05:06.243683981 +0000 UTC m=+1116.364867113" Oct 02 07:05:06 crc kubenswrapper[4786]: I1002 07:05:06.261048 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.261035352 podStartE2EDuration="38.261035352s" podCreationTimestamp="2025-10-02 07:04:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:05:06.254980922 +0000 UTC m=+1116.376164063" watchObservedRunningTime="2025-10-02 07:05:06.261035352 +0000 UTC m=+1116.382218483" Oct 02 07:05:06 crc kubenswrapper[4786]: I1002 07:05:06.272841 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" podStartSLOduration=2.906683519 podStartE2EDuration="9.272824619s" podCreationTimestamp="2025-10-02 07:04:57 +0000 UTC" firstStartedPulling="2025-10-02 07:04:58.680104656 +0000 UTC m=+1108.801287787" lastFinishedPulling="2025-10-02 07:05:05.046245757 +0000 UTC m=+1115.167428887" observedRunningTime="2025-10-02 07:05:06.269567276 +0000 UTC m=+1116.390750417" watchObservedRunningTime="2025-10-02 07:05:06.272824619 +0000 UTC m=+1116.394007750" Oct 02 07:05:17 crc kubenswrapper[4786]: I1002 07:05:17.280935 4786 generic.go:334] "Generic (PLEG): container finished" podID="5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11" containerID="400f2ca41d48b5254f8df9f64dd8d42200ec204fe63e161c52f46fa5100bd5ff" exitCode=0 Oct 02 07:05:17 crc kubenswrapper[4786]: I1002 07:05:17.281013 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" event={"ID":"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11","Type":"ContainerDied","Data":"400f2ca41d48b5254f8df9f64dd8d42200ec204fe63e161c52f46fa5100bd5ff"} Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.345854 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.623475 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.711168 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-ssh-key\") pod \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.711207 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-inventory\") pod \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.711751 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-repo-setup-combined-ca-bundle\") pod \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.712075 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6nvn\" (UniqueName: \"kubernetes.io/projected/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-kube-api-access-q6nvn\") pod \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\" (UID: \"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11\") " Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.717094 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-kube-api-access-q6nvn" (OuterVolumeSpecName: "kube-api-access-q6nvn") pod "5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11" (UID: "5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11"). InnerVolumeSpecName "kube-api-access-q6nvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.718370 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11" (UID: "5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.733708 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11" (UID: "5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.735861 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-inventory" (OuterVolumeSpecName: "inventory") pod "5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11" (UID: "5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.814114 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.814141 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.814151 4786 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:05:18 crc kubenswrapper[4786]: I1002 07:05:18.814160 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6nvn\" (UniqueName: \"kubernetes.io/projected/5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11-kube-api-access-q6nvn\") on node \"crc\" DevicePath \"\"" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.295816 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" event={"ID":"5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11","Type":"ContainerDied","Data":"f9931a1e2be3dc04bc7347fe99cbceab25fed74f246545b04f531ebfa833fedb"} Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.295852 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9931a1e2be3dc04bc7347fe99cbceab25fed74f246545b04f531ebfa833fedb" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.295900 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c879k" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.323882 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.363405 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk"] Oct 02 07:05:19 crc kubenswrapper[4786]: E1002 07:05:19.363735 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.363747 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.363941 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.364445 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.367983 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.368047 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.368211 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.369713 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.375748 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk"] Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.526338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctrpt\" (UniqueName: \"kubernetes.io/projected/daf8cf99-f61d-4fc6-bc92-045acafed529-kube-api-access-ctrpt\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.526543 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.526574 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.628048 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctrpt\" (UniqueName: \"kubernetes.io/projected/daf8cf99-f61d-4fc6-bc92-045acafed529-kube-api-access-ctrpt\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.628255 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.628627 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.631242 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.631456 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.641470 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctrpt\" (UniqueName: \"kubernetes.io/projected/daf8cf99-f61d-4fc6-bc92-045acafed529-kube-api-access-ctrpt\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jfhsk\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:19 crc kubenswrapper[4786]: I1002 07:05:19.687620 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:20 crc kubenswrapper[4786]: I1002 07:05:20.099240 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk"] Oct 02 07:05:20 crc kubenswrapper[4786]: W1002 07:05:20.102664 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaf8cf99_f61d_4fc6_bc92_045acafed529.slice/crio-a8303bac7e28f17024d1732ce7d14e3394c2bc4053389ac3bdbee158691ff1a8 WatchSource:0}: Error finding container a8303bac7e28f17024d1732ce7d14e3394c2bc4053389ac3bdbee158691ff1a8: Status 404 returned error can't find the container with id a8303bac7e28f17024d1732ce7d14e3394c2bc4053389ac3bdbee158691ff1a8 Oct 02 07:05:20 crc kubenswrapper[4786]: I1002 07:05:20.104848 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 07:05:20 crc kubenswrapper[4786]: I1002 07:05:20.303032 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" event={"ID":"daf8cf99-f61d-4fc6-bc92-045acafed529","Type":"ContainerStarted","Data":"a8303bac7e28f17024d1732ce7d14e3394c2bc4053389ac3bdbee158691ff1a8"} Oct 02 07:05:21 crc kubenswrapper[4786]: I1002 07:05:21.310685 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" event={"ID":"daf8cf99-f61d-4fc6-bc92-045acafed529","Type":"ContainerStarted","Data":"dbdc6ad7a3cf201d9edd6383a39ac6866b98b7becceaa794c5a8e149137c587f"} Oct 02 07:05:21 crc kubenswrapper[4786]: I1002 07:05:21.320944 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" podStartSLOduration=1.791781427 podStartE2EDuration="2.320935153s" podCreationTimestamp="2025-10-02 07:05:19 +0000 UTC" 
firstStartedPulling="2025-10-02 07:05:20.104620908 +0000 UTC m=+1130.225804039" lastFinishedPulling="2025-10-02 07:05:20.633774635 +0000 UTC m=+1130.754957765" observedRunningTime="2025-10-02 07:05:21.320522175 +0000 UTC m=+1131.441705326" watchObservedRunningTime="2025-10-02 07:05:21.320935153 +0000 UTC m=+1131.442118285" Oct 02 07:05:23 crc kubenswrapper[4786]: I1002 07:05:23.325247 4786 generic.go:334] "Generic (PLEG): container finished" podID="daf8cf99-f61d-4fc6-bc92-045acafed529" containerID="dbdc6ad7a3cf201d9edd6383a39ac6866b98b7becceaa794c5a8e149137c587f" exitCode=0 Oct 02 07:05:23 crc kubenswrapper[4786]: I1002 07:05:23.325322 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" event={"ID":"daf8cf99-f61d-4fc6-bc92-045acafed529","Type":"ContainerDied","Data":"dbdc6ad7a3cf201d9edd6383a39ac6866b98b7becceaa794c5a8e149137c587f"} Oct 02 07:05:24 crc kubenswrapper[4786]: I1002 07:05:24.612239 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:24 crc kubenswrapper[4786]: I1002 07:05:24.802555 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-inventory\") pod \"daf8cf99-f61d-4fc6-bc92-045acafed529\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " Oct 02 07:05:24 crc kubenswrapper[4786]: I1002 07:05:24.802672 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctrpt\" (UniqueName: \"kubernetes.io/projected/daf8cf99-f61d-4fc6-bc92-045acafed529-kube-api-access-ctrpt\") pod \"daf8cf99-f61d-4fc6-bc92-045acafed529\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " Oct 02 07:05:24 crc kubenswrapper[4786]: I1002 07:05:24.802726 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key\") pod \"daf8cf99-f61d-4fc6-bc92-045acafed529\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " Oct 02 07:05:24 crc kubenswrapper[4786]: I1002 07:05:24.806356 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf8cf99-f61d-4fc6-bc92-045acafed529-kube-api-access-ctrpt" (OuterVolumeSpecName: "kube-api-access-ctrpt") pod "daf8cf99-f61d-4fc6-bc92-045acafed529" (UID: "daf8cf99-f61d-4fc6-bc92-045acafed529"). InnerVolumeSpecName "kube-api-access-ctrpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:05:24 crc kubenswrapper[4786]: E1002 07:05:24.820277 4786 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key podName:daf8cf99-f61d-4fc6-bc92-045acafed529 nodeName:}" failed. No retries permitted until 2025-10-02 07:05:25.320252496 +0000 UTC m=+1135.441435627 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key") pod "daf8cf99-f61d-4fc6-bc92-045acafed529" (UID: "daf8cf99-f61d-4fc6-bc92-045acafed529") : error deleting /var/lib/kubelet/pods/daf8cf99-f61d-4fc6-bc92-045acafed529/volume-subpaths: remove /var/lib/kubelet/pods/daf8cf99-f61d-4fc6-bc92-045acafed529/volume-subpaths: no such file or directory Oct 02 07:05:24 crc kubenswrapper[4786]: I1002 07:05:24.822046 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-inventory" (OuterVolumeSpecName: "inventory") pod "daf8cf99-f61d-4fc6-bc92-045acafed529" (UID: "daf8cf99-f61d-4fc6-bc92-045acafed529"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:05:24 crc kubenswrapper[4786]: I1002 07:05:24.905137 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctrpt\" (UniqueName: \"kubernetes.io/projected/daf8cf99-f61d-4fc6-bc92-045acafed529-kube-api-access-ctrpt\") on node \"crc\" DevicePath \"\"" Oct 02 07:05:24 crc kubenswrapper[4786]: I1002 07:05:24.905236 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.338997 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" event={"ID":"daf8cf99-f61d-4fc6-bc92-045acafed529","Type":"ContainerDied","Data":"a8303bac7e28f17024d1732ce7d14e3394c2bc4053389ac3bdbee158691ff1a8"} Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.339208 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8303bac7e28f17024d1732ce7d14e3394c2bc4053389ac3bdbee158691ff1a8" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 
07:05:25.339024 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jfhsk" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.379481 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf"] Oct 02 07:05:25 crc kubenswrapper[4786]: E1002 07:05:25.379819 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf8cf99-f61d-4fc6-bc92-045acafed529" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.379841 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf8cf99-f61d-4fc6-bc92-045acafed529" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.380004 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf8cf99-f61d-4fc6-bc92-045acafed529" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.380509 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.386790 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf"] Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.410361 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key\") pod \"daf8cf99-f61d-4fc6-bc92-045acafed529\" (UID: \"daf8cf99-f61d-4fc6-bc92-045acafed529\") " Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.412731 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "daf8cf99-f61d-4fc6-bc92-045acafed529" (UID: "daf8cf99-f61d-4fc6-bc92-045acafed529"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.512939 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.513176 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 
07:05:25.513225 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv6rx\" (UniqueName: \"kubernetes.io/projected/2bd6d4c9-950c-4051-a20e-cfae8655dab2-kube-api-access-nv6rx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.513308 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.513448 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/daf8cf99-f61d-4fc6-bc92-045acafed529-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.614674 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.614874 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 
07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.614900 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv6rx\" (UniqueName: \"kubernetes.io/projected/2bd6d4c9-950c-4051-a20e-cfae8655dab2-kube-api-access-nv6rx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.614940 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.617402 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.618041 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.618099 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.628098 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv6rx\" (UniqueName: \"kubernetes.io/projected/2bd6d4c9-950c-4051-a20e-cfae8655dab2-kube-api-access-nv6rx\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:25 crc kubenswrapper[4786]: I1002 07:05:25.699783 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:05:26 crc kubenswrapper[4786]: W1002 07:05:26.105321 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd6d4c9_950c_4051_a20e_cfae8655dab2.slice/crio-336c3b5bab280318f928a86fca21d199918771db1d1a8bd2d86370afc0eb91f7 WatchSource:0}: Error finding container 336c3b5bab280318f928a86fca21d199918771db1d1a8bd2d86370afc0eb91f7: Status 404 returned error can't find the container with id 336c3b5bab280318f928a86fca21d199918771db1d1a8bd2d86370afc0eb91f7 Oct 02 07:05:26 crc kubenswrapper[4786]: I1002 07:05:26.105657 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf"] Oct 02 07:05:26 crc kubenswrapper[4786]: I1002 07:05:26.347621 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" event={"ID":"2bd6d4c9-950c-4051-a20e-cfae8655dab2","Type":"ContainerStarted","Data":"336c3b5bab280318f928a86fca21d199918771db1d1a8bd2d86370afc0eb91f7"} Oct 02 07:05:27 crc kubenswrapper[4786]: I1002 07:05:27.355355 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" event={"ID":"2bd6d4c9-950c-4051-a20e-cfae8655dab2","Type":"ContainerStarted","Data":"7408c611b6265a73c67c6fe9d857902dbd89d8794352a2ae0fbe71e68106932b"} Oct 02 07:05:27 crc kubenswrapper[4786]: I1002 07:05:27.368088 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" podStartSLOduration=1.783869867 podStartE2EDuration="2.368073377s" podCreationTimestamp="2025-10-02 07:05:25 +0000 UTC" firstStartedPulling="2025-10-02 07:05:26.107185024 +0000 UTC m=+1136.228368155" lastFinishedPulling="2025-10-02 07:05:26.691388534 +0000 UTC m=+1136.812571665" observedRunningTime="2025-10-02 07:05:27.364683234 +0000 UTC m=+1137.485866375" watchObservedRunningTime="2025-10-02 07:05:27.368073377 +0000 UTC m=+1137.489256508" Oct 02 07:05:27 crc kubenswrapper[4786]: I1002 07:05:27.497860 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:05:27 crc kubenswrapper[4786]: I1002 07:05:27.497908 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:05:57 crc kubenswrapper[4786]: I1002 07:05:57.497364 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:05:57 crc 
kubenswrapper[4786]: I1002 07:05:57.498253 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:05:57 crc kubenswrapper[4786]: I1002 07:05:57.498347 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 07:05:57 crc kubenswrapper[4786]: I1002 07:05:57.498817 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"090157c59cd3f7df3b613a19c4694de2405b30b33d0b408048a029fe6c54264a"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 07:05:57 crc kubenswrapper[4786]: I1002 07:05:57.498937 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://090157c59cd3f7df3b613a19c4694de2405b30b33d0b408048a029fe6c54264a" gracePeriod=600 Oct 02 07:05:58 crc kubenswrapper[4786]: I1002 07:05:58.546361 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="090157c59cd3f7df3b613a19c4694de2405b30b33d0b408048a029fe6c54264a" exitCode=0 Oct 02 07:05:58 crc kubenswrapper[4786]: I1002 07:05:58.546414 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"090157c59cd3f7df3b613a19c4694de2405b30b33d0b408048a029fe6c54264a"} 
Oct 02 07:05:58 crc kubenswrapper[4786]: I1002 07:05:58.546945 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"6bed9a44163665a582af239dfca9191384ef1aa472fc58dcc47dcd9b608c85c4"} Oct 02 07:05:58 crc kubenswrapper[4786]: I1002 07:05:58.546968 4786 scope.go:117] "RemoveContainer" containerID="d7bde13c3a2f638d163652c020a2e40b2d8399d146317237502071d1e44c36be" Oct 02 07:07:38 crc kubenswrapper[4786]: I1002 07:07:38.101066 4786 scope.go:117] "RemoveContainer" containerID="da6420d39a5a07d64561c400d6cad60810b13256e8b3defafa02bf67865c0716" Oct 02 07:07:38 crc kubenswrapper[4786]: I1002 07:07:38.122412 4786 scope.go:117] "RemoveContainer" containerID="fdbd48719a6a31bfd0cbbfe5cb723475e0d8c36f5c3df1d981ed4ce4e9def17f" Oct 02 07:07:57 crc kubenswrapper[4786]: I1002 07:07:57.497015 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:07:57 crc kubenswrapper[4786]: I1002 07:07:57.497316 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:08:23 crc kubenswrapper[4786]: I1002 07:08:23.482862 4786 generic.go:334] "Generic (PLEG): container finished" podID="2bd6d4c9-950c-4051-a20e-cfae8655dab2" containerID="7408c611b6265a73c67c6fe9d857902dbd89d8794352a2ae0fbe71e68106932b" exitCode=0 Oct 02 07:08:23 crc kubenswrapper[4786]: I1002 07:08:23.482933 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" event={"ID":"2bd6d4c9-950c-4051-a20e-cfae8655dab2","Type":"ContainerDied","Data":"7408c611b6265a73c67c6fe9d857902dbd89d8794352a2ae0fbe71e68106932b"} Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.775036 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.969893 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-inventory\") pod \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.969950 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv6rx\" (UniqueName: \"kubernetes.io/projected/2bd6d4c9-950c-4051-a20e-cfae8655dab2-kube-api-access-nv6rx\") pod \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.969983 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-ssh-key\") pod \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.970051 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-bootstrap-combined-ca-bundle\") pod \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\" (UID: \"2bd6d4c9-950c-4051-a20e-cfae8655dab2\") " Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.974598 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2bd6d4c9-950c-4051-a20e-cfae8655dab2-kube-api-access-nv6rx" (OuterVolumeSpecName: "kube-api-access-nv6rx") pod "2bd6d4c9-950c-4051-a20e-cfae8655dab2" (UID: "2bd6d4c9-950c-4051-a20e-cfae8655dab2"). InnerVolumeSpecName "kube-api-access-nv6rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.974838 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2bd6d4c9-950c-4051-a20e-cfae8655dab2" (UID: "2bd6d4c9-950c-4051-a20e-cfae8655dab2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.990237 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-inventory" (OuterVolumeSpecName: "inventory") pod "2bd6d4c9-950c-4051-a20e-cfae8655dab2" (UID: "2bd6d4c9-950c-4051-a20e-cfae8655dab2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:08:24 crc kubenswrapper[4786]: I1002 07:08:24.990905 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2bd6d4c9-950c-4051-a20e-cfae8655dab2" (UID: "2bd6d4c9-950c-4051-a20e-cfae8655dab2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.071585 4786 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.071605 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.071614 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv6rx\" (UniqueName: \"kubernetes.io/projected/2bd6d4c9-950c-4051-a20e-cfae8655dab2-kube-api-access-nv6rx\") on node \"crc\" DevicePath \"\"" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.071623 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bd6d4c9-950c-4051-a20e-cfae8655dab2-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.494929 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" event={"ID":"2bd6d4c9-950c-4051-a20e-cfae8655dab2","Type":"ContainerDied","Data":"336c3b5bab280318f928a86fca21d199918771db1d1a8bd2d86370afc0eb91f7"} Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.494968 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="336c3b5bab280318f928a86fca21d199918771db1d1a8bd2d86370afc0eb91f7" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.494967 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.546633 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw"] Oct 02 07:08:25 crc kubenswrapper[4786]: E1002 07:08:25.547224 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd6d4c9-950c-4051-a20e-cfae8655dab2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.547250 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd6d4c9-950c-4051-a20e-cfae8655dab2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.547457 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd6d4c9-950c-4051-a20e-cfae8655dab2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.548026 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.549853 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.549994 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.550292 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.551283 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.553131 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw"] Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.578936 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.579003 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.579234 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct49x\" (UniqueName: \"kubernetes.io/projected/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-kube-api-access-ct49x\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.680661 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct49x\" (UniqueName: \"kubernetes.io/projected/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-kube-api-access-ct49x\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.680808 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.680895 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.684895 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.684972 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.693619 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct49x\" (UniqueName: \"kubernetes.io/projected/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-kube-api-access-ct49x\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:25 crc kubenswrapper[4786]: I1002 07:08:25.861926 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:08:26 crc kubenswrapper[4786]: I1002 07:08:26.282085 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw"] Oct 02 07:08:26 crc kubenswrapper[4786]: I1002 07:08:26.501775 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" event={"ID":"aa4315e2-27f5-4b58-91a6-1d5b683c6e55","Type":"ContainerStarted","Data":"a52cd4a75e73fbad56781813e9e393dbdaf98260a777235181a1490236563ea1"} Oct 02 07:08:27 crc kubenswrapper[4786]: I1002 07:08:27.497060 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:08:27 crc kubenswrapper[4786]: I1002 07:08:27.497262 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:08:27 crc kubenswrapper[4786]: I1002 07:08:27.508485 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" event={"ID":"aa4315e2-27f5-4b58-91a6-1d5b683c6e55","Type":"ContainerStarted","Data":"f89f85deca23f6bb636d150eae3f3cb368644d138b29bb0e1ec2dec0483c7e84"} Oct 02 07:08:27 crc kubenswrapper[4786]: I1002 07:08:27.522897 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" podStartSLOduration=1.938832411 podStartE2EDuration="2.52288188s" 
podCreationTimestamp="2025-10-02 07:08:25 +0000 UTC" firstStartedPulling="2025-10-02 07:08:26.287087199 +0000 UTC m=+1316.408270331" lastFinishedPulling="2025-10-02 07:08:26.871136669 +0000 UTC m=+1316.992319800" observedRunningTime="2025-10-02 07:08:27.517725035 +0000 UTC m=+1317.638908176" watchObservedRunningTime="2025-10-02 07:08:27.52288188 +0000 UTC m=+1317.644065010" Oct 02 07:08:57 crc kubenswrapper[4786]: I1002 07:08:57.497439 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:08:57 crc kubenswrapper[4786]: I1002 07:08:57.498061 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:08:57 crc kubenswrapper[4786]: I1002 07:08:57.498123 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 07:08:57 crc kubenswrapper[4786]: I1002 07:08:57.498911 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6bed9a44163665a582af239dfca9191384ef1aa472fc58dcc47dcd9b608c85c4"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 07:08:57 crc kubenswrapper[4786]: I1002 07:08:57.498982 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" 
podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://6bed9a44163665a582af239dfca9191384ef1aa472fc58dcc47dcd9b608c85c4" gracePeriod=600 Oct 02 07:08:57 crc kubenswrapper[4786]: I1002 07:08:57.687567 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="6bed9a44163665a582af239dfca9191384ef1aa472fc58dcc47dcd9b608c85c4" exitCode=0 Oct 02 07:08:57 crc kubenswrapper[4786]: I1002 07:08:57.687605 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"6bed9a44163665a582af239dfca9191384ef1aa472fc58dcc47dcd9b608c85c4"} Oct 02 07:08:57 crc kubenswrapper[4786]: I1002 07:08:57.687645 4786 scope.go:117] "RemoveContainer" containerID="090157c59cd3f7df3b613a19c4694de2405b30b33d0b408048a029fe6c54264a" Oct 02 07:08:58 crc kubenswrapper[4786]: I1002 07:08:58.695583 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921"} Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.634896 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5f2kc"] Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.636733 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.646646 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5f2kc"] Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.778901 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-utilities\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.779141 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-catalog-content\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.779825 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kph65\" (UniqueName: \"kubernetes.io/projected/cdf394fd-3ebf-409a-84b2-ad054e544706-kube-api-access-kph65\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.881593 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kph65\" (UniqueName: \"kubernetes.io/projected/cdf394fd-3ebf-409a-84b2-ad054e544706-kube-api-access-kph65\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.881720 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-utilities\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.881775 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-catalog-content\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.882170 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-utilities\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.882218 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-catalog-content\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.897983 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kph65\" (UniqueName: \"kubernetes.io/projected/cdf394fd-3ebf-409a-84b2-ad054e544706-kube-api-access-kph65\") pod \"redhat-operators-5f2kc\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:10 crc kubenswrapper[4786]: I1002 07:10:10.951211 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:11 crc kubenswrapper[4786]: I1002 07:10:11.346274 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5f2kc"] Oct 02 07:10:12 crc kubenswrapper[4786]: I1002 07:10:12.152438 4786 generic.go:334] "Generic (PLEG): container finished" podID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerID="aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44" exitCode=0 Oct 02 07:10:12 crc kubenswrapper[4786]: I1002 07:10:12.152523 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2kc" event={"ID":"cdf394fd-3ebf-409a-84b2-ad054e544706","Type":"ContainerDied","Data":"aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44"} Oct 02 07:10:12 crc kubenswrapper[4786]: I1002 07:10:12.152665 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2kc" event={"ID":"cdf394fd-3ebf-409a-84b2-ad054e544706","Type":"ContainerStarted","Data":"0719c101cac7ed9e6ca1e8e4d5e483ec333d78924d53b170c9190192450366a7"} Oct 02 07:10:13 crc kubenswrapper[4786]: I1002 07:10:13.178333 4786 generic.go:334] "Generic (PLEG): container finished" podID="aa4315e2-27f5-4b58-91a6-1d5b683c6e55" containerID="f89f85deca23f6bb636d150eae3f3cb368644d138b29bb0e1ec2dec0483c7e84" exitCode=0 Oct 02 07:10:13 crc kubenswrapper[4786]: I1002 07:10:13.178404 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" event={"ID":"aa4315e2-27f5-4b58-91a6-1d5b683c6e55","Type":"ContainerDied","Data":"f89f85deca23f6bb636d150eae3f3cb368644d138b29bb0e1ec2dec0483c7e84"} Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.186916 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2kc" 
event={"ID":"cdf394fd-3ebf-409a-84b2-ad054e544706","Type":"ContainerStarted","Data":"1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52"} Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.458520 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.636504 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct49x\" (UniqueName: \"kubernetes.io/projected/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-kube-api-access-ct49x\") pod \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.636591 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-ssh-key\") pod \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.636674 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-inventory\") pod \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\" (UID: \"aa4315e2-27f5-4b58-91a6-1d5b683c6e55\") " Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.640939 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-kube-api-access-ct49x" (OuterVolumeSpecName: "kube-api-access-ct49x") pod "aa4315e2-27f5-4b58-91a6-1d5b683c6e55" (UID: "aa4315e2-27f5-4b58-91a6-1d5b683c6e55"). InnerVolumeSpecName "kube-api-access-ct49x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.656976 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-inventory" (OuterVolumeSpecName: "inventory") pod "aa4315e2-27f5-4b58-91a6-1d5b683c6e55" (UID: "aa4315e2-27f5-4b58-91a6-1d5b683c6e55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.657415 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa4315e2-27f5-4b58-91a6-1d5b683c6e55" (UID: "aa4315e2-27f5-4b58-91a6-1d5b683c6e55"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.738362 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct49x\" (UniqueName: \"kubernetes.io/projected/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-kube-api-access-ct49x\") on node \"crc\" DevicePath \"\"" Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.738388 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:10:14 crc kubenswrapper[4786]: I1002 07:10:14.738397 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa4315e2-27f5-4b58-91a6-1d5b683c6e55-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.192430 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.192424 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw" event={"ID":"aa4315e2-27f5-4b58-91a6-1d5b683c6e55","Type":"ContainerDied","Data":"a52cd4a75e73fbad56781813e9e393dbdaf98260a777235181a1490236563ea1"} Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.192648 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52cd4a75e73fbad56781813e9e393dbdaf98260a777235181a1490236563ea1" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.193888 4786 generic.go:334] "Generic (PLEG): container finished" podID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerID="1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52" exitCode=0 Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.193922 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2kc" event={"ID":"cdf394fd-3ebf-409a-84b2-ad054e544706","Type":"ContainerDied","Data":"1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52"} Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.252592 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq"] Oct 02 07:10:15 crc kubenswrapper[4786]: E1002 07:10:15.252971 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4315e2-27f5-4b58-91a6-1d5b683c6e55" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.252989 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4315e2-27f5-4b58-91a6-1d5b683c6e55" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.253209 4786 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="aa4315e2-27f5-4b58-91a6-1d5b683c6e55" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.253760 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.255256 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.255754 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.257101 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.258743 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.260337 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq"] Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.346294 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.346338 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-784fv\" (UniqueName: 
\"kubernetes.io/projected/2747d6cd-de52-43b7-a2d0-16c86deecd42-kube-api-access-784fv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.346424 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.447553 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.447594 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-784fv\" (UniqueName: \"kubernetes.io/projected/2747d6cd-de52-43b7-a2d0-16c86deecd42-kube-api-access-784fv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.447642 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.450332 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.450428 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.459583 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-784fv\" (UniqueName: \"kubernetes.io/projected/2747d6cd-de52-43b7-a2d0-16c86deecd42-kube-api-access-784fv\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.568079 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:10:15 crc kubenswrapper[4786]: I1002 07:10:15.979967 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq"] Oct 02 07:10:15 crc kubenswrapper[4786]: W1002 07:10:15.980942 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2747d6cd_de52_43b7_a2d0_16c86deecd42.slice/crio-3dd805d204c8b1efc32fdb6060477cc18c9c329dfcdc8cf24792503b9b637fd6 WatchSource:0}: Error finding container 3dd805d204c8b1efc32fdb6060477cc18c9c329dfcdc8cf24792503b9b637fd6: Status 404 returned error can't find the container with id 3dd805d204c8b1efc32fdb6060477cc18c9c329dfcdc8cf24792503b9b637fd6 Oct 02 07:10:16 crc kubenswrapper[4786]: I1002 07:10:16.202173 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2kc" event={"ID":"cdf394fd-3ebf-409a-84b2-ad054e544706","Type":"ContainerStarted","Data":"cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09"} Oct 02 07:10:16 crc kubenswrapper[4786]: I1002 07:10:16.203139 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" event={"ID":"2747d6cd-de52-43b7-a2d0-16c86deecd42","Type":"ContainerStarted","Data":"3dd805d204c8b1efc32fdb6060477cc18c9c329dfcdc8cf24792503b9b637fd6"} Oct 02 07:10:16 crc kubenswrapper[4786]: I1002 07:10:16.216238 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5f2kc" podStartSLOduration=2.704804328 podStartE2EDuration="6.216225759s" podCreationTimestamp="2025-10-02 07:10:10 +0000 UTC" firstStartedPulling="2025-10-02 07:10:12.15376298 +0000 UTC m=+1422.274946112" lastFinishedPulling="2025-10-02 07:10:15.665184411 +0000 UTC m=+1425.786367543" 
observedRunningTime="2025-10-02 07:10:16.212815878 +0000 UTC m=+1426.333999019" watchObservedRunningTime="2025-10-02 07:10:16.216225759 +0000 UTC m=+1426.337408890" Oct 02 07:10:17 crc kubenswrapper[4786]: I1002 07:10:17.224874 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" event={"ID":"2747d6cd-de52-43b7-a2d0-16c86deecd42","Type":"ContainerStarted","Data":"9d9ed388196c55f390d0a0bdb5202d3e6c9d7ade495bd964e7f6a13932a80331"} Oct 02 07:10:17 crc kubenswrapper[4786]: I1002 07:10:17.238898 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" podStartSLOduration=1.593976038 podStartE2EDuration="2.238883945s" podCreationTimestamp="2025-10-02 07:10:15 +0000 UTC" firstStartedPulling="2025-10-02 07:10:15.983047143 +0000 UTC m=+1426.104230274" lastFinishedPulling="2025-10-02 07:10:16.62795505 +0000 UTC m=+1426.749138181" observedRunningTime="2025-10-02 07:10:17.237593742 +0000 UTC m=+1427.358776874" watchObservedRunningTime="2025-10-02 07:10:17.238883945 +0000 UTC m=+1427.360067076" Oct 02 07:10:20 crc kubenswrapper[4786]: I1002 07:10:20.952079 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:20 crc kubenswrapper[4786]: I1002 07:10:20.952450 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:20 crc kubenswrapper[4786]: I1002 07:10:20.989896 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:21 crc kubenswrapper[4786]: I1002 07:10:21.282006 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:21 crc kubenswrapper[4786]: I1002 07:10:21.317632 4786 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5f2kc"] Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.260967 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5f2kc" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerName="registry-server" containerID="cri-o://cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09" gracePeriod=2 Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.603949 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.764303 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-utilities\") pod \"cdf394fd-3ebf-409a-84b2-ad054e544706\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.764401 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kph65\" (UniqueName: \"kubernetes.io/projected/cdf394fd-3ebf-409a-84b2-ad054e544706-kube-api-access-kph65\") pod \"cdf394fd-3ebf-409a-84b2-ad054e544706\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.764466 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-catalog-content\") pod \"cdf394fd-3ebf-409a-84b2-ad054e544706\" (UID: \"cdf394fd-3ebf-409a-84b2-ad054e544706\") " Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.764906 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-utilities" (OuterVolumeSpecName: "utilities") pod 
"cdf394fd-3ebf-409a-84b2-ad054e544706" (UID: "cdf394fd-3ebf-409a-84b2-ad054e544706"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.769439 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf394fd-3ebf-409a-84b2-ad054e544706-kube-api-access-kph65" (OuterVolumeSpecName: "kube-api-access-kph65") pod "cdf394fd-3ebf-409a-84b2-ad054e544706" (UID: "cdf394fd-3ebf-409a-84b2-ad054e544706"). InnerVolumeSpecName "kube-api-access-kph65". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.823847 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdf394fd-3ebf-409a-84b2-ad054e544706" (UID: "cdf394fd-3ebf-409a-84b2-ad054e544706"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.866048 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.866075 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf394fd-3ebf-409a-84b2-ad054e544706-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:10:23 crc kubenswrapper[4786]: I1002 07:10:23.866084 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kph65\" (UniqueName: \"kubernetes.io/projected/cdf394fd-3ebf-409a-84b2-ad054e544706-kube-api-access-kph65\") on node \"crc\" DevicePath \"\"" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.268657 4786 generic.go:334] "Generic (PLEG): container finished" podID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerID="cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09" exitCode=0 Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.268721 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5f2kc" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.268720 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2kc" event={"ID":"cdf394fd-3ebf-409a-84b2-ad054e544706","Type":"ContainerDied","Data":"cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09"} Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.269025 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f2kc" event={"ID":"cdf394fd-3ebf-409a-84b2-ad054e544706","Type":"ContainerDied","Data":"0719c101cac7ed9e6ca1e8e4d5e483ec333d78924d53b170c9190192450366a7"} Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.269042 4786 scope.go:117] "RemoveContainer" containerID="cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.285706 4786 scope.go:117] "RemoveContainer" containerID="1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.286403 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5f2kc"] Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.292826 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5f2kc"] Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.305310 4786 scope.go:117] "RemoveContainer" containerID="aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.334157 4786 scope.go:117] "RemoveContainer" containerID="cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09" Oct 02 07:10:24 crc kubenswrapper[4786]: E1002 07:10:24.334542 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09\": container with ID starting with cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09 not found: ID does not exist" containerID="cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.334588 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09"} err="failed to get container status \"cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09\": rpc error: code = NotFound desc = could not find container \"cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09\": container with ID starting with cc0b0aa4c1821873592d52543c6dfa237835005f6e7174a50d1a4894a110ce09 not found: ID does not exist" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.334615 4786 scope.go:117] "RemoveContainer" containerID="1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52" Oct 02 07:10:24 crc kubenswrapper[4786]: E1002 07:10:24.334890 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52\": container with ID starting with 1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52 not found: ID does not exist" containerID="1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.334921 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52"} err="failed to get container status \"1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52\": rpc error: code = NotFound desc = could not find container \"1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52\": container with ID 
starting with 1fac4db663641acb2d6357121d178b17082bc1d146ed0f0914a2d404d7d31f52 not found: ID does not exist" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.334945 4786 scope.go:117] "RemoveContainer" containerID="aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44" Oct 02 07:10:24 crc kubenswrapper[4786]: E1002 07:10:24.335247 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44\": container with ID starting with aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44 not found: ID does not exist" containerID="aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44" Oct 02 07:10:24 crc kubenswrapper[4786]: I1002 07:10:24.335267 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44"} err="failed to get container status \"aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44\": rpc error: code = NotFound desc = could not find container \"aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44\": container with ID starting with aaad6d84ecce7c09f4d89cd2e5be3b7fabc620d23007bc314ffbee366c925e44 not found: ID does not exist" Oct 02 07:10:26 crc kubenswrapper[4786]: I1002 07:10:26.186652 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" path="/var/lib/kubelet/pods/cdf394fd-3ebf-409a-84b2-ad054e544706/volumes" Oct 02 07:10:33 crc kubenswrapper[4786]: I1002 07:10:33.027431 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-s5xkn"] Oct 02 07:10:33 crc kubenswrapper[4786]: I1002 07:10:33.033651 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-phlf9"] Oct 02 07:10:33 crc kubenswrapper[4786]: I1002 07:10:33.039767 4786 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7wctg"] Oct 02 07:10:33 crc kubenswrapper[4786]: I1002 07:10:33.045437 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-phlf9"] Oct 02 07:10:33 crc kubenswrapper[4786]: I1002 07:10:33.050405 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-s5xkn"] Oct 02 07:10:33 crc kubenswrapper[4786]: I1002 07:10:33.055080 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7wctg"] Oct 02 07:10:34 crc kubenswrapper[4786]: I1002 07:10:34.187121 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2" path="/var/lib/kubelet/pods/15e2f1a7-bc49-4c0b-8231-f7daae1e9ae2/volumes" Oct 02 07:10:34 crc kubenswrapper[4786]: I1002 07:10:34.188594 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2bd960-9783-448c-a27e-de71fc1e096d" path="/var/lib/kubelet/pods/4e2bd960-9783-448c-a27e-de71fc1e096d/volumes" Oct 02 07:10:34 crc kubenswrapper[4786]: I1002 07:10:34.189152 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4e3d221-990a-4056-ae85-ebbc7fbca4a1" path="/var/lib/kubelet/pods/a4e3d221-990a-4056-ae85-ebbc7fbca4a1/volumes" Oct 02 07:10:38 crc kubenswrapper[4786]: I1002 07:10:38.218600 4786 scope.go:117] "RemoveContainer" containerID="5513e846693cb28e50278686823b04fe1cf590f356f9d1ac063bdd87bb8923c1" Oct 02 07:10:38 crc kubenswrapper[4786]: I1002 07:10:38.235423 4786 scope.go:117] "RemoveContainer" containerID="6a8dec2b7a33f18fa0586287d623ca8664d99713e2fb2d55da33bdd48062d627" Oct 02 07:10:38 crc kubenswrapper[4786]: I1002 07:10:38.267277 4786 scope.go:117] "RemoveContainer" containerID="57e4e0e0c7ec8ca559d6754d40447456328fe97d2b5a55c0d79cb7ad60f75731" Oct 02 07:10:43 crc kubenswrapper[4786]: I1002 07:10:43.025978 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-9870-account-create-mcqqp"] Oct 02 07:10:43 crc kubenswrapper[4786]: I1002 07:10:43.037812 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5002-account-create-hczhb"] Oct 02 07:10:43 crc kubenswrapper[4786]: I1002 07:10:43.044148 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-71df-account-create-8kpnq"] Oct 02 07:10:43 crc kubenswrapper[4786]: I1002 07:10:43.049037 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9870-account-create-mcqqp"] Oct 02 07:10:43 crc kubenswrapper[4786]: I1002 07:10:43.053968 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5002-account-create-hczhb"] Oct 02 07:10:43 crc kubenswrapper[4786]: I1002 07:10:43.059914 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-71df-account-create-8kpnq"] Oct 02 07:10:44 crc kubenswrapper[4786]: I1002 07:10:44.186797 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa4cb6ec-a541-4d78-8410-331cd7066a02" path="/var/lib/kubelet/pods/aa4cb6ec-a541-4d78-8410-331cd7066a02/volumes" Oct 02 07:10:44 crc kubenswrapper[4786]: I1002 07:10:44.187249 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee69903b-44e4-4424-b9ff-9b41438de93f" path="/var/lib/kubelet/pods/ee69903b-44e4-4424-b9ff-9b41438de93f/volumes" Oct 02 07:10:44 crc kubenswrapper[4786]: I1002 07:10:44.187671 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc4fa25-9013-4615-867d-97e8df05d30f" path="/var/lib/kubelet/pods/efc4fa25-9013-4615-867d-97e8df05d30f/volumes" Oct 02 07:10:50 crc kubenswrapper[4786]: I1002 07:10:50.025248 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-tq8qn"] Oct 02 07:10:50 crc kubenswrapper[4786]: I1002 07:10:50.032089 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-78h8k"] Oct 02 07:10:50 crc 
kubenswrapper[4786]: I1002 07:10:50.037850 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mgr56"] Oct 02 07:10:50 crc kubenswrapper[4786]: I1002 07:10:50.043608 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-78h8k"] Oct 02 07:10:50 crc kubenswrapper[4786]: I1002 07:10:50.049012 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-tq8qn"] Oct 02 07:10:50 crc kubenswrapper[4786]: I1002 07:10:50.054349 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mgr56"] Oct 02 07:10:50 crc kubenswrapper[4786]: I1002 07:10:50.186270 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f24c17-9405-445e-9f7b-9bb550347912" path="/var/lib/kubelet/pods/24f24c17-9405-445e-9f7b-9bb550347912/volumes" Oct 02 07:10:50 crc kubenswrapper[4786]: I1002 07:10:50.186815 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="793c01ad-7938-4d1b-913d-5bfa2e48b721" path="/var/lib/kubelet/pods/793c01ad-7938-4d1b-913d-5bfa2e48b721/volumes" Oct 02 07:10:50 crc kubenswrapper[4786]: I1002 07:10:50.187283 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12a88ec-24f7-415c-945e-21884b53b156" path="/var/lib/kubelet/pods/d12a88ec-24f7-415c-945e-21884b53b156/volumes" Oct 02 07:10:57 crc kubenswrapper[4786]: I1002 07:10:57.497334 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:10:57 crc kubenswrapper[4786]: I1002 07:10:57.497669 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:11:04 crc kubenswrapper[4786]: I1002 07:11:04.020857 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-sz2xw"] Oct 02 07:11:04 crc kubenswrapper[4786]: I1002 07:11:04.028660 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-sz2xw"] Oct 02 07:11:04 crc kubenswrapper[4786]: I1002 07:11:04.187374 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce72c1de-6645-43e2-88bf-11f40a02d16c" path="/var/lib/kubelet/pods/ce72c1de-6645-43e2-88bf-11f40a02d16c/volumes" Oct 02 07:11:07 crc kubenswrapper[4786]: I1002 07:11:07.018779 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0a22-account-create-qrrp9"] Oct 02 07:11:07 crc kubenswrapper[4786]: I1002 07:11:07.024635 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4170-account-create-6l78c"] Oct 02 07:11:07 crc kubenswrapper[4786]: I1002 07:11:07.030247 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-149f-account-create-lxndm"] Oct 02 07:11:07 crc kubenswrapper[4786]: I1002 07:11:07.034956 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-149f-account-create-lxndm"] Oct 02 07:11:07 crc kubenswrapper[4786]: I1002 07:11:07.039433 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0a22-account-create-qrrp9"] Oct 02 07:11:07 crc kubenswrapper[4786]: I1002 07:11:07.043847 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4170-account-create-6l78c"] Oct 02 07:11:08 crc kubenswrapper[4786]: I1002 07:11:08.186311 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468d7644-ca17-43aa-88b5-f4917044b91f" path="/var/lib/kubelet/pods/468d7644-ca17-43aa-88b5-f4917044b91f/volumes" Oct 02 07:11:08 crc kubenswrapper[4786]: I1002 
07:11:08.186788 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e8a0cbf-1247-443e-b393-b4981c99f28f" path="/var/lib/kubelet/pods/7e8a0cbf-1247-443e-b393-b4981c99f28f/volumes" Oct 02 07:11:08 crc kubenswrapper[4786]: I1002 07:11:08.187239 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6d6948-364a-4b79-ba0b-7ea0645e36e4" path="/var/lib/kubelet/pods/ed6d6948-364a-4b79-ba0b-7ea0645e36e4/volumes" Oct 02 07:11:11 crc kubenswrapper[4786]: I1002 07:11:11.019142 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-kbkxq"] Oct 02 07:11:11 crc kubenswrapper[4786]: I1002 07:11:11.024735 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-kbkxq"] Oct 02 07:11:11 crc kubenswrapper[4786]: I1002 07:11:11.571411 4786 generic.go:334] "Generic (PLEG): container finished" podID="2747d6cd-de52-43b7-a2d0-16c86deecd42" containerID="9d9ed388196c55f390d0a0bdb5202d3e6c9d7ade495bd964e7f6a13932a80331" exitCode=0 Oct 02 07:11:11 crc kubenswrapper[4786]: I1002 07:11:11.571494 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" event={"ID":"2747d6cd-de52-43b7-a2d0-16c86deecd42","Type":"ContainerDied","Data":"9d9ed388196c55f390d0a0bdb5202d3e6c9d7ade495bd964e7f6a13932a80331"} Oct 02 07:11:12 crc kubenswrapper[4786]: I1002 07:11:12.186705 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="205062cf-def7-4b01-b1cf-7f2e1d0ef398" path="/var/lib/kubelet/pods/205062cf-def7-4b01-b1cf-7f2e1d0ef398/volumes" Oct 02 07:11:12 crc kubenswrapper[4786]: I1002 07:11:12.885786 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.061017 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-ssh-key\") pod \"2747d6cd-de52-43b7-a2d0-16c86deecd42\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.061073 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-784fv\" (UniqueName: \"kubernetes.io/projected/2747d6cd-de52-43b7-a2d0-16c86deecd42-kube-api-access-784fv\") pod \"2747d6cd-de52-43b7-a2d0-16c86deecd42\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.061189 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-inventory\") pod \"2747d6cd-de52-43b7-a2d0-16c86deecd42\" (UID: \"2747d6cd-de52-43b7-a2d0-16c86deecd42\") " Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.065471 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2747d6cd-de52-43b7-a2d0-16c86deecd42-kube-api-access-784fv" (OuterVolumeSpecName: "kube-api-access-784fv") pod "2747d6cd-de52-43b7-a2d0-16c86deecd42" (UID: "2747d6cd-de52-43b7-a2d0-16c86deecd42"). InnerVolumeSpecName "kube-api-access-784fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.082271 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2747d6cd-de52-43b7-a2d0-16c86deecd42" (UID: "2747d6cd-de52-43b7-a2d0-16c86deecd42"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.083428 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-inventory" (OuterVolumeSpecName: "inventory") pod "2747d6cd-de52-43b7-a2d0-16c86deecd42" (UID: "2747d6cd-de52-43b7-a2d0-16c86deecd42"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.162594 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.162617 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2747d6cd-de52-43b7-a2d0-16c86deecd42-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.162626 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-784fv\" (UniqueName: \"kubernetes.io/projected/2747d6cd-de52-43b7-a2d0-16c86deecd42-kube-api-access-784fv\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.585370 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" event={"ID":"2747d6cd-de52-43b7-a2d0-16c86deecd42","Type":"ContainerDied","Data":"3dd805d204c8b1efc32fdb6060477cc18c9c329dfcdc8cf24792503b9b637fd6"} Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.585668 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dd805d204c8b1efc32fdb6060477cc18c9c329dfcdc8cf24792503b9b637fd6" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.585407 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.634794 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8"] Oct 02 07:11:13 crc kubenswrapper[4786]: E1002 07:11:13.635354 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2747d6cd-de52-43b7-a2d0-16c86deecd42" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.635424 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2747d6cd-de52-43b7-a2d0-16c86deecd42" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:13 crc kubenswrapper[4786]: E1002 07:11:13.635498 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerName="extract-utilities" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.635544 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerName="extract-utilities" Oct 02 07:11:13 crc kubenswrapper[4786]: E1002 07:11:13.635599 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerName="registry-server" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.635651 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerName="registry-server" Oct 02 07:11:13 crc kubenswrapper[4786]: E1002 07:11:13.635736 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerName="extract-content" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.635783 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerName="extract-content" Oct 02 07:11:13 crc 
kubenswrapper[4786]: I1002 07:11:13.636005 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2747d6cd-de52-43b7-a2d0-16c86deecd42" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.636070 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf394fd-3ebf-409a-84b2-ad054e544706" containerName="registry-server" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.636635 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.638187 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.638366 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.638569 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.639532 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.640497 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8"] Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.772822 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.772911 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.772949 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxbc4\" (UniqueName: \"kubernetes.io/projected/18524406-101e-474c-853f-5674430d613f-kube-api-access-bxbc4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.874379 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.874442 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.874468 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bxbc4\" (UniqueName: \"kubernetes.io/projected/18524406-101e-474c-853f-5674430d613f-kube-api-access-bxbc4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.878220 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.878423 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.888352 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxbc4\" (UniqueName: \"kubernetes.io/projected/18524406-101e-474c-853f-5674430d613f-kube-api-access-bxbc4\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:13 crc kubenswrapper[4786]: I1002 07:11:13.950547 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:14 crc kubenswrapper[4786]: I1002 07:11:14.375125 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8"] Oct 02 07:11:14 crc kubenswrapper[4786]: W1002 07:11:14.377254 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18524406_101e_474c_853f_5674430d613f.slice/crio-7e064bfebb1b22c4fd325ead1c2173a87efa6c41f089415041019324693645d0 WatchSource:0}: Error finding container 7e064bfebb1b22c4fd325ead1c2173a87efa6c41f089415041019324693645d0: Status 404 returned error can't find the container with id 7e064bfebb1b22c4fd325ead1c2173a87efa6c41f089415041019324693645d0 Oct 02 07:11:14 crc kubenswrapper[4786]: I1002 07:11:14.379624 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 07:11:14 crc kubenswrapper[4786]: I1002 07:11:14.592739 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" event={"ID":"18524406-101e-474c-853f-5674430d613f","Type":"ContainerStarted","Data":"7e064bfebb1b22c4fd325ead1c2173a87efa6c41f089415041019324693645d0"} Oct 02 07:11:15 crc kubenswrapper[4786]: I1002 07:11:15.602489 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" event={"ID":"18524406-101e-474c-853f-5674430d613f","Type":"ContainerStarted","Data":"ed0e93254528c57163f847a79689c24162f376e7466bb274247acd8ecf872ec9"} Oct 02 07:11:15 crc kubenswrapper[4786]: I1002 07:11:15.621937 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" podStartSLOduration=2.159561955 podStartE2EDuration="2.621924795s" podCreationTimestamp="2025-10-02 
07:11:13 +0000 UTC" firstStartedPulling="2025-10-02 07:11:14.379322363 +0000 UTC m=+1484.500505494" lastFinishedPulling="2025-10-02 07:11:14.841685204 +0000 UTC m=+1484.962868334" observedRunningTime="2025-10-02 07:11:15.617144781 +0000 UTC m=+1485.738327923" watchObservedRunningTime="2025-10-02 07:11:15.621924795 +0000 UTC m=+1485.743107927" Oct 02 07:11:17 crc kubenswrapper[4786]: I1002 07:11:17.026746 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zt9q9"] Oct 02 07:11:17 crc kubenswrapper[4786]: I1002 07:11:17.032202 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zt9q9"] Oct 02 07:11:18 crc kubenswrapper[4786]: I1002 07:11:18.019141 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lbkdj"] Oct 02 07:11:18 crc kubenswrapper[4786]: I1002 07:11:18.024576 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lbkdj"] Oct 02 07:11:18 crc kubenswrapper[4786]: I1002 07:11:18.186620 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc1379e-f398-4974-a6c1-408d344ff49c" path="/var/lib/kubelet/pods/5cc1379e-f398-4974-a6c1-408d344ff49c/volumes" Oct 02 07:11:18 crc kubenswrapper[4786]: I1002 07:11:18.187291 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46" path="/var/lib/kubelet/pods/fa7e0555-d3ef-4a1a-ad2c-7e0b8cf4fd46/volumes" Oct 02 07:11:18 crc kubenswrapper[4786]: I1002 07:11:18.621671 4786 generic.go:334] "Generic (PLEG): container finished" podID="18524406-101e-474c-853f-5674430d613f" containerID="ed0e93254528c57163f847a79689c24162f376e7466bb274247acd8ecf872ec9" exitCode=0 Oct 02 07:11:18 crc kubenswrapper[4786]: I1002 07:11:18.621723 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" 
event={"ID":"18524406-101e-474c-853f-5674430d613f","Type":"ContainerDied","Data":"ed0e93254528c57163f847a79689c24162f376e7466bb274247acd8ecf872ec9"} Oct 02 07:11:19 crc kubenswrapper[4786]: I1002 07:11:19.929314 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:19 crc kubenswrapper[4786]: I1002 07:11:19.969798 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-inventory\") pod \"18524406-101e-474c-853f-5674430d613f\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " Oct 02 07:11:19 crc kubenswrapper[4786]: I1002 07:11:19.969876 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxbc4\" (UniqueName: \"kubernetes.io/projected/18524406-101e-474c-853f-5674430d613f-kube-api-access-bxbc4\") pod \"18524406-101e-474c-853f-5674430d613f\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " Oct 02 07:11:19 crc kubenswrapper[4786]: I1002 07:11:19.969994 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-ssh-key\") pod \"18524406-101e-474c-853f-5674430d613f\" (UID: \"18524406-101e-474c-853f-5674430d613f\") " Oct 02 07:11:19 crc kubenswrapper[4786]: I1002 07:11:19.974591 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18524406-101e-474c-853f-5674430d613f-kube-api-access-bxbc4" (OuterVolumeSpecName: "kube-api-access-bxbc4") pod "18524406-101e-474c-853f-5674430d613f" (UID: "18524406-101e-474c-853f-5674430d613f"). InnerVolumeSpecName "kube-api-access-bxbc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:11:19 crc kubenswrapper[4786]: I1002 07:11:19.990056 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "18524406-101e-474c-853f-5674430d613f" (UID: "18524406-101e-474c-853f-5674430d613f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:11:19 crc kubenswrapper[4786]: I1002 07:11:19.992592 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-inventory" (OuterVolumeSpecName: "inventory") pod "18524406-101e-474c-853f-5674430d613f" (UID: "18524406-101e-474c-853f-5674430d613f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.072020 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.072043 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/18524406-101e-474c-853f-5674430d613f-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.072053 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxbc4\" (UniqueName: \"kubernetes.io/projected/18524406-101e-474c-853f-5674430d613f-kube-api-access-bxbc4\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.635592 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" 
event={"ID":"18524406-101e-474c-853f-5674430d613f","Type":"ContainerDied","Data":"7e064bfebb1b22c4fd325ead1c2173a87efa6c41f089415041019324693645d0"} Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.635875 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e064bfebb1b22c4fd325ead1c2173a87efa6c41f089415041019324693645d0" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.635633 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.680819 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5"] Oct 02 07:11:20 crc kubenswrapper[4786]: E1002 07:11:20.681111 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18524406-101e-474c-853f-5674430d613f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.681128 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="18524406-101e-474c-853f-5674430d613f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.681309 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="18524406-101e-474c-853f-5674430d613f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.681828 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.683336 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.683786 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.684397 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.684549 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.692370 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5"] Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.782877 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.782946 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.783067 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqpsj\" (UniqueName: \"kubernetes.io/projected/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-kube-api-access-vqpsj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.884121 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.884386 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.884520 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqpsj\" (UniqueName: \"kubernetes.io/projected/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-kube-api-access-vqpsj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.887640 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: 
\"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.888026 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.897912 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqpsj\" (UniqueName: \"kubernetes.io/projected/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-kube-api-access-vqpsj\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-7r8j5\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:20 crc kubenswrapper[4786]: I1002 07:11:20.994478 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:21 crc kubenswrapper[4786]: I1002 07:11:21.433350 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5"] Oct 02 07:11:21 crc kubenswrapper[4786]: I1002 07:11:21.643080 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" event={"ID":"4267f5c8-feae-4e65-a80a-ebb4c7003eaf","Type":"ContainerStarted","Data":"9b8cdcaa37aa13caf5389477d6137581e0656bd574c3c52ee24a027c306e8bb9"} Oct 02 07:11:22 crc kubenswrapper[4786]: I1002 07:11:22.651295 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" event={"ID":"4267f5c8-feae-4e65-a80a-ebb4c7003eaf","Type":"ContainerStarted","Data":"b1f3ad4158f147a32f99db80c2dc1c6b3d396c57b4b53d889a202c4a44b4afea"} Oct 02 07:11:22 crc kubenswrapper[4786]: I1002 07:11:22.662348 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" podStartSLOduration=2.129073364 podStartE2EDuration="2.662335915s" podCreationTimestamp="2025-10-02 07:11:20 +0000 UTC" firstStartedPulling="2025-10-02 07:11:21.436918117 +0000 UTC m=+1491.558101248" lastFinishedPulling="2025-10-02 07:11:21.970180668 +0000 UTC m=+1492.091363799" observedRunningTime="2025-10-02 07:11:22.661144699 +0000 UTC m=+1492.782327829" watchObservedRunningTime="2025-10-02 07:11:22.662335915 +0000 UTC m=+1492.783519046" Oct 02 07:11:24 crc kubenswrapper[4786]: I1002 07:11:24.018886 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-f42fq"] Oct 02 07:11:24 crc kubenswrapper[4786]: I1002 07:11:24.024779 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-f42fq"] Oct 02 07:11:24 crc kubenswrapper[4786]: I1002 07:11:24.187650 4786 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca674644-5646-452d-ba2a-5ff2844f64ea" path="/var/lib/kubelet/pods/ca674644-5646-452d-ba2a-5ff2844f64ea/volumes" Oct 02 07:11:27 crc kubenswrapper[4786]: I1002 07:11:27.020417 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xt8ms"] Oct 02 07:11:27 crc kubenswrapper[4786]: I1002 07:11:27.026100 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xt8ms"] Oct 02 07:11:27 crc kubenswrapper[4786]: I1002 07:11:27.497702 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:11:27 crc kubenswrapper[4786]: I1002 07:11:27.497751 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:11:28 crc kubenswrapper[4786]: I1002 07:11:28.186406 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f7b97c5-b382-4fdf-bb25-b384b16eb1f6" path="/var/lib/kubelet/pods/9f7b97c5-b382-4fdf-bb25-b384b16eb1f6/volumes" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.351519 4786 scope.go:117] "RemoveContainer" containerID="183be18201f0cbf05314f48a2d7db651df44eef2a3a372b53ef59fe0cb41c8ff" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.378191 4786 scope.go:117] "RemoveContainer" containerID="ba5418438a4f7505b4c9a04aea6183e45dc55c96f799083df04b85b6faaa645f" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.400137 4786 scope.go:117] "RemoveContainer" 
containerID="639fcb46b84eea7c9c255304e2221e76af4340976ae3b17477a61eb3c132b293" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.433440 4786 scope.go:117] "RemoveContainer" containerID="995abe9d11b6677d42d75ca72fd216368afdf472ab80eca4fcdbd5575d43730d" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.470243 4786 scope.go:117] "RemoveContainer" containerID="87fd90869d9e4fbff99cf260989e28e00b27250d2b1614cc2327a287c318437b" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.489473 4786 scope.go:117] "RemoveContainer" containerID="3cbb17a720fe7ecc8ae4be1c6c60699767ee5ccea5cb632318c976a6fd62a279" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.515937 4786 scope.go:117] "RemoveContainer" containerID="f239a343a449fdf0fa68f6585f065e334559dde91bd48427401248d6a8585064" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.538658 4786 scope.go:117] "RemoveContainer" containerID="2180bad403aa120ca73eb4c7fc08aaad33410b1a1b15857057521c74dbd589cc" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.570803 4786 scope.go:117] "RemoveContainer" containerID="32a54afef2916bd13e35f5eb42c0de64c9d3e3df5caa85e9f12197c61260ed1f" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.590204 4786 scope.go:117] "RemoveContainer" containerID="6038acf2fde2f928a0de0cf0a88fe4d098677f8e301447053373f6194a309e6c" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.604241 4786 scope.go:117] "RemoveContainer" containerID="68819bd3851a2c4cd97a8f88ce5331a658e7d9c9c14dced99463f14f11a8fe90" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.626990 4786 scope.go:117] "RemoveContainer" containerID="58bbb8d245cc15875729b5bec0bb9dac8cfe41f891bb1f2826999da1bfab00bc" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.642075 4786 scope.go:117] "RemoveContainer" containerID="21914d2214c99cf7e1da2e1f961f98d5d21eecf347eabccdcf5a1bcc230c2642" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.656670 4786 scope.go:117] "RemoveContainer" 
containerID="c6bb577cad8c79305fa456f676a35cb17dd34c66a74d65a8dd754b3e250f1df5" Oct 02 07:11:38 crc kubenswrapper[4786]: I1002 07:11:38.669762 4786 scope.go:117] "RemoveContainer" containerID="6909801ba952a1df67fefdef25050a9855909b9742b17e77bcb85ab885236d9a" Oct 02 07:11:42 crc kubenswrapper[4786]: I1002 07:11:42.019660 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-g9gtn"] Oct 02 07:11:42 crc kubenswrapper[4786]: I1002 07:11:42.026181 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-g9gtn"] Oct 02 07:11:42 crc kubenswrapper[4786]: I1002 07:11:42.187167 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd" path="/var/lib/kubelet/pods/609b41c2-45b6-4b81-abe6-2ac3fe5ffdcd/volumes" Oct 02 07:11:47 crc kubenswrapper[4786]: I1002 07:11:47.811400 4786 generic.go:334] "Generic (PLEG): container finished" podID="4267f5c8-feae-4e65-a80a-ebb4c7003eaf" containerID="b1f3ad4158f147a32f99db80c2dc1c6b3d396c57b4b53d889a202c4a44b4afea" exitCode=0 Oct 02 07:11:47 crc kubenswrapper[4786]: I1002 07:11:47.811486 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" event={"ID":"4267f5c8-feae-4e65-a80a-ebb4c7003eaf","Type":"ContainerDied","Data":"b1f3ad4158f147a32f99db80c2dc1c6b3d396c57b4b53d889a202c4a44b4afea"} Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.129679 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.208268 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-inventory\") pod \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.208628 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqpsj\" (UniqueName: \"kubernetes.io/projected/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-kube-api-access-vqpsj\") pod \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.208811 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-ssh-key\") pod \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\" (UID: \"4267f5c8-feae-4e65-a80a-ebb4c7003eaf\") " Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.213985 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-kube-api-access-vqpsj" (OuterVolumeSpecName: "kube-api-access-vqpsj") pod "4267f5c8-feae-4e65-a80a-ebb4c7003eaf" (UID: "4267f5c8-feae-4e65-a80a-ebb4c7003eaf"). InnerVolumeSpecName "kube-api-access-vqpsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.229962 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4267f5c8-feae-4e65-a80a-ebb4c7003eaf" (UID: "4267f5c8-feae-4e65-a80a-ebb4c7003eaf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.231494 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-inventory" (OuterVolumeSpecName: "inventory") pod "4267f5c8-feae-4e65-a80a-ebb4c7003eaf" (UID: "4267f5c8-feae-4e65-a80a-ebb4c7003eaf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.311620 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.311646 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.311657 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqpsj\" (UniqueName: \"kubernetes.io/projected/4267f5c8-feae-4e65-a80a-ebb4c7003eaf-kube-api-access-vqpsj\") on node \"crc\" DevicePath \"\"" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.835744 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" event={"ID":"4267f5c8-feae-4e65-a80a-ebb4c7003eaf","Type":"ContainerDied","Data":"9b8cdcaa37aa13caf5389477d6137581e0656bd574c3c52ee24a027c306e8bb9"} Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.835805 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b8cdcaa37aa13caf5389477d6137581e0656bd574c3c52ee24a027c306e8bb9" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.835885 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-7r8j5" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.904646 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g"] Oct 02 07:11:49 crc kubenswrapper[4786]: E1002 07:11:49.904965 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4267f5c8-feae-4e65-a80a-ebb4c7003eaf" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.904983 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4267f5c8-feae-4e65-a80a-ebb4c7003eaf" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.905136 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4267f5c8-feae-4e65-a80a-ebb4c7003eaf" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.906264 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.909644 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.909757 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.911819 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.915885 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g"] Oct 02 07:11:49 crc kubenswrapper[4786]: I1002 07:11:49.919466 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.024501 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.024770 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-464bp\" (UniqueName: \"kubernetes.io/projected/d3db3504-6495-465c-8c96-90b80bdcb97e-kube-api-access-464bp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.024793 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.126301 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.126368 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-464bp\" (UniqueName: \"kubernetes.io/projected/d3db3504-6495-465c-8c96-90b80bdcb97e-kube-api-access-464bp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.126391 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.129361 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: 
\"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.129854 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.139997 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-464bp\" (UniqueName: \"kubernetes.io/projected/d3db3504-6495-465c-8c96-90b80bdcb97e-kube-api-access-464bp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.219746 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.632837 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g"] Oct 02 07:11:50 crc kubenswrapper[4786]: I1002 07:11:50.843422 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" event={"ID":"d3db3504-6495-465c-8c96-90b80bdcb97e","Type":"ContainerStarted","Data":"dbabba74da35a170a30647cbab4c8698f3a7a7615b000584e3081a445529e984"} Oct 02 07:11:51 crc kubenswrapper[4786]: I1002 07:11:51.851272 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" event={"ID":"d3db3504-6495-465c-8c96-90b80bdcb97e","Type":"ContainerStarted","Data":"0cd9c85c53fa71850cbe1de7583cdee5fb9e41f724395ce27af1e27eda1037ba"} Oct 02 07:11:51 crc kubenswrapper[4786]: I1002 07:11:51.866605 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" podStartSLOduration=2.402300375 podStartE2EDuration="2.866591601s" podCreationTimestamp="2025-10-02 07:11:49 +0000 UTC" firstStartedPulling="2025-10-02 07:11:50.638980596 +0000 UTC m=+1520.760163728" lastFinishedPulling="2025-10-02 07:11:51.103271834 +0000 UTC m=+1521.224454954" observedRunningTime="2025-10-02 07:11:51.863039873 +0000 UTC m=+1521.984223014" watchObservedRunningTime="2025-10-02 07:11:51.866591601 +0000 UTC m=+1521.987774732" Oct 02 07:11:57 crc kubenswrapper[4786]: I1002 07:11:57.497820 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:11:57 crc 
kubenswrapper[4786]: I1002 07:11:57.498068 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:11:57 crc kubenswrapper[4786]: I1002 07:11:57.498113 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 07:11:57 crc kubenswrapper[4786]: I1002 07:11:57.498761 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 07:11:57 crc kubenswrapper[4786]: I1002 07:11:57.498812 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" gracePeriod=600 Oct 02 07:11:57 crc kubenswrapper[4786]: E1002 07:11:57.618766 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:11:57 crc kubenswrapper[4786]: I1002 07:11:57.891528 4786 generic.go:334] "Generic 
(PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" exitCode=0 Oct 02 07:11:57 crc kubenswrapper[4786]: I1002 07:11:57.891568 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921"} Oct 02 07:11:57 crc kubenswrapper[4786]: I1002 07:11:57.891597 4786 scope.go:117] "RemoveContainer" containerID="6bed9a44163665a582af239dfca9191384ef1aa472fc58dcc47dcd9b608c85c4" Oct 02 07:11:57 crc kubenswrapper[4786]: I1002 07:11:57.892199 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:11:57 crc kubenswrapper[4786]: E1002 07:11:57.892425 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:12:03 crc kubenswrapper[4786]: I1002 07:12:03.025497 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wssnr"] Oct 02 07:12:03 crc kubenswrapper[4786]: I1002 07:12:03.031519 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-77z8r"] Oct 02 07:12:03 crc kubenswrapper[4786]: I1002 07:12:03.036735 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hx78d"] Oct 02 07:12:03 crc kubenswrapper[4786]: I1002 07:12:03.041281 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wssnr"] Oct 
02 07:12:03 crc kubenswrapper[4786]: I1002 07:12:03.045717 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-77z8r"] Oct 02 07:12:03 crc kubenswrapper[4786]: I1002 07:12:03.049937 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hx78d"] Oct 02 07:12:04 crc kubenswrapper[4786]: I1002 07:12:04.191461 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae" path="/var/lib/kubelet/pods/aa2dceb4-1028-4ca1-972d-2f0b13bfc5ae/volumes" Oct 02 07:12:04 crc kubenswrapper[4786]: I1002 07:12:04.192015 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcad0a6b-e52d-463e-a961-0f081f4c0993" path="/var/lib/kubelet/pods/bcad0a6b-e52d-463e-a961-0f081f4c0993/volumes" Oct 02 07:12:04 crc kubenswrapper[4786]: I1002 07:12:04.192470 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd800405-6ab0-42b0-b1c3-b9667275a541" path="/var/lib/kubelet/pods/cd800405-6ab0-42b0-b1c3-b9667275a541/volumes" Oct 02 07:12:10 crc kubenswrapper[4786]: I1002 07:12:10.183455 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:12:10 crc kubenswrapper[4786]: E1002 07:12:10.183953 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:12:19 crc kubenswrapper[4786]: I1002 07:12:19.033250 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c952-account-create-tw57w"] Oct 02 07:12:19 crc kubenswrapper[4786]: I1002 07:12:19.041008 4786 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-2ca4-account-create-qsdpx"] Oct 02 07:12:19 crc kubenswrapper[4786]: I1002 07:12:19.048281 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2ca4-account-create-qsdpx"] Oct 02 07:12:19 crc kubenswrapper[4786]: I1002 07:12:19.053849 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c952-account-create-tw57w"] Oct 02 07:12:20 crc kubenswrapper[4786]: I1002 07:12:20.021000 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-87c5-account-create-ngfsv"] Oct 02 07:12:20 crc kubenswrapper[4786]: I1002 07:12:20.026145 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-87c5-account-create-ngfsv"] Oct 02 07:12:20 crc kubenswrapper[4786]: I1002 07:12:20.187598 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e54cd6-8543-4b97-a441-80ba704bb59c" path="/var/lib/kubelet/pods/53e54cd6-8543-4b97-a441-80ba704bb59c/volumes" Oct 02 07:12:20 crc kubenswrapper[4786]: I1002 07:12:20.188126 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c26c9b38-23f6-440e-9667-a61014dc7d4b" path="/var/lib/kubelet/pods/c26c9b38-23f6-440e-9667-a61014dc7d4b/volumes" Oct 02 07:12:20 crc kubenswrapper[4786]: I1002 07:12:20.188875 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4dcd6ee-cc67-4e35-ab8b-69b792c190b7" path="/var/lib/kubelet/pods/e4dcd6ee-cc67-4e35-ab8b-69b792c190b7/volumes" Oct 02 07:12:24 crc kubenswrapper[4786]: I1002 07:12:24.179155 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:12:24 crc kubenswrapper[4786]: E1002 07:12:24.180012 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:12:35 crc kubenswrapper[4786]: I1002 07:12:35.178822 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:12:35 crc kubenswrapper[4786]: E1002 07:12:35.179952 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:12:36 crc kubenswrapper[4786]: I1002 07:12:36.137383 4786 generic.go:334] "Generic (PLEG): container finished" podID="d3db3504-6495-465c-8c96-90b80bdcb97e" containerID="0cd9c85c53fa71850cbe1de7583cdee5fb9e41f724395ce27af1e27eda1037ba" exitCode=2 Oct 02 07:12:36 crc kubenswrapper[4786]: I1002 07:12:36.137429 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" event={"ID":"d3db3504-6495-465c-8c96-90b80bdcb97e","Type":"ContainerDied","Data":"0cd9c85c53fa71850cbe1de7583cdee5fb9e41f724395ce27af1e27eda1037ba"} Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.441709 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.614026 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-inventory\") pod \"d3db3504-6495-465c-8c96-90b80bdcb97e\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.614100 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-464bp\" (UniqueName: \"kubernetes.io/projected/d3db3504-6495-465c-8c96-90b80bdcb97e-kube-api-access-464bp\") pod \"d3db3504-6495-465c-8c96-90b80bdcb97e\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.614147 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-ssh-key\") pod \"d3db3504-6495-465c-8c96-90b80bdcb97e\" (UID: \"d3db3504-6495-465c-8c96-90b80bdcb97e\") " Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.618575 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3db3504-6495-465c-8c96-90b80bdcb97e-kube-api-access-464bp" (OuterVolumeSpecName: "kube-api-access-464bp") pod "d3db3504-6495-465c-8c96-90b80bdcb97e" (UID: "d3db3504-6495-465c-8c96-90b80bdcb97e"). InnerVolumeSpecName "kube-api-access-464bp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.633996 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d3db3504-6495-465c-8c96-90b80bdcb97e" (UID: "d3db3504-6495-465c-8c96-90b80bdcb97e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.634518 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-inventory" (OuterVolumeSpecName: "inventory") pod "d3db3504-6495-465c-8c96-90b80bdcb97e" (UID: "d3db3504-6495-465c-8c96-90b80bdcb97e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.716216 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.716250 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-464bp\" (UniqueName: \"kubernetes.io/projected/d3db3504-6495-465c-8c96-90b80bdcb97e-kube-api-access-464bp\") on node \"crc\" DevicePath \"\"" Oct 02 07:12:37 crc kubenswrapper[4786]: I1002 07:12:37.716260 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d3db3504-6495-465c-8c96-90b80bdcb97e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.150635 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" event={"ID":"d3db3504-6495-465c-8c96-90b80bdcb97e","Type":"ContainerDied","Data":"dbabba74da35a170a30647cbab4c8698f3a7a7615b000584e3081a445529e984"} Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.150669 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g" Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.150673 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbabba74da35a170a30647cbab4c8698f3a7a7615b000584e3081a445529e984" Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.842861 4786 scope.go:117] "RemoveContainer" containerID="0a36b977fca2b2ccc4b427af522e8f149fd31eb13cda835da5bad44c02d54b08" Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.858211 4786 scope.go:117] "RemoveContainer" containerID="d1dfa9a79e80682af0fb489de6befcb8fa54f1427032b3bd1129640b273c7e67" Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.888756 4786 scope.go:117] "RemoveContainer" containerID="54323187d828070b7c871ed68d3841c79e06f6a95f1610a2663bb04b632e232e" Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.919048 4786 scope.go:117] "RemoveContainer" containerID="6b09b618b2f9da7abfa451c77cd2ec8607af56bb1db1d42cd935ae827737e605" Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.948398 4786 scope.go:117] "RemoveContainer" containerID="27fcc957455b933c54be6675e68f272b76e9e8f157ef3e564ef0a63d116d9a74" Oct 02 07:12:38 crc kubenswrapper[4786]: I1002 07:12:38.980242 4786 scope.go:117] "RemoveContainer" containerID="1ad3d16a6c45db5808d8b0214b1ab84c4ff660a73dc5d71f6a4e0ae76e53f735" Oct 02 07:12:39 crc kubenswrapper[4786]: I1002 07:12:39.022330 4786 scope.go:117] "RemoveContainer" containerID="adcbf979f8f44f3e4eac40115e23bd364c08ee3e40122d1cea06ca203da73a9e" Oct 02 07:12:41 crc kubenswrapper[4786]: I1002 07:12:41.021282 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgckl"] Oct 02 07:12:41 crc kubenswrapper[4786]: I1002 07:12:41.026385 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rgckl"] Oct 02 07:12:42 crc kubenswrapper[4786]: I1002 07:12:42.186998 4786 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="311482dc-62f2-47f5-b1ea-005877d89e83" path="/var/lib/kubelet/pods/311482dc-62f2-47f5-b1ea-005877d89e83/volumes" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.018896 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp"] Oct 02 07:12:46 crc kubenswrapper[4786]: E1002 07:12:46.020201 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3db3504-6495-465c-8c96-90b80bdcb97e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.020233 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3db3504-6495-465c-8c96-90b80bdcb97e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.020462 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3db3504-6495-465c-8c96-90b80bdcb97e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.021087 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.023577 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.023677 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.023727 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.024228 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.026295 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp"] Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.135772 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95rl6\" (UniqueName: \"kubernetes.io/projected/169def9f-29e2-41fb-bf34-86464f366256-kube-api-access-95rl6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.135860 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.135890 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.237589 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95rl6\" (UniqueName: \"kubernetes.io/projected/169def9f-29e2-41fb-bf34-86464f366256-kube-api-access-95rl6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.237645 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.237672 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.242152 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: 
\"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.243050 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.250913 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95rl6\" (UniqueName: \"kubernetes.io/projected/169def9f-29e2-41fb-bf34-86464f366256-kube-api-access-95rl6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.335209 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.690965 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bgbgv"] Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.692708 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.707993 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bgbgv"] Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.797099 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp"] Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.845507 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whdjg\" (UniqueName: \"kubernetes.io/projected/d7b8f881-bb98-4622-8c6a-59bc7f55c342-kube-api-access-whdjg\") pod \"community-operators-bgbgv\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.845907 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-catalog-content\") pod \"community-operators-bgbgv\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.845956 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-utilities\") pod \"community-operators-bgbgv\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.947231 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-catalog-content\") pod \"community-operators-bgbgv\" (UID: 
\"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.947293 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-utilities\") pod \"community-operators-bgbgv\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.947345 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whdjg\" (UniqueName: \"kubernetes.io/projected/d7b8f881-bb98-4622-8c6a-59bc7f55c342-kube-api-access-whdjg\") pod \"community-operators-bgbgv\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.947655 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-catalog-content\") pod \"community-operators-bgbgv\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.947785 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-utilities\") pod \"community-operators-bgbgv\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:46 crc kubenswrapper[4786]: I1002 07:12:46.963778 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whdjg\" (UniqueName: \"kubernetes.io/projected/d7b8f881-bb98-4622-8c6a-59bc7f55c342-kube-api-access-whdjg\") pod \"community-operators-bgbgv\" (UID: 
\"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:47 crc kubenswrapper[4786]: I1002 07:12:47.011528 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:47 crc kubenswrapper[4786]: I1002 07:12:47.212317 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" event={"ID":"169def9f-29e2-41fb-bf34-86464f366256","Type":"ContainerStarted","Data":"50c6e90e427e91dd58cae2556321b1e517776e559a5c2081796f30e75d991113"} Oct 02 07:12:47 crc kubenswrapper[4786]: I1002 07:12:47.375360 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bgbgv"] Oct 02 07:12:47 crc kubenswrapper[4786]: W1002 07:12:47.381107 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b8f881_bb98_4622_8c6a_59bc7f55c342.slice/crio-bd75955d72471f546555d30bb694a3b5afae96f433f311a5ece611dfd0fbec62 WatchSource:0}: Error finding container bd75955d72471f546555d30bb694a3b5afae96f433f311a5ece611dfd0fbec62: Status 404 returned error can't find the container with id bd75955d72471f546555d30bb694a3b5afae96f433f311a5ece611dfd0fbec62 Oct 02 07:12:48 crc kubenswrapper[4786]: I1002 07:12:48.183680 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:12:48 crc kubenswrapper[4786]: E1002 07:12:48.184048 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" 
podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:12:48 crc kubenswrapper[4786]: I1002 07:12:48.219826 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" event={"ID":"169def9f-29e2-41fb-bf34-86464f366256","Type":"ContainerStarted","Data":"b8813c885f87b90e15bb3407b9b8d600e8472291bf9e6ff6c747f84c93c14e69"} Oct 02 07:12:48 crc kubenswrapper[4786]: I1002 07:12:48.221783 4786 generic.go:334] "Generic (PLEG): container finished" podID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerID="67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132" exitCode=0 Oct 02 07:12:48 crc kubenswrapper[4786]: I1002 07:12:48.221812 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgbgv" event={"ID":"d7b8f881-bb98-4622-8c6a-59bc7f55c342","Type":"ContainerDied","Data":"67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132"} Oct 02 07:12:48 crc kubenswrapper[4786]: I1002 07:12:48.221829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgbgv" event={"ID":"d7b8f881-bb98-4622-8c6a-59bc7f55c342","Type":"ContainerStarted","Data":"bd75955d72471f546555d30bb694a3b5afae96f433f311a5ece611dfd0fbec62"} Oct 02 07:12:48 crc kubenswrapper[4786]: I1002 07:12:48.248653 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" podStartSLOduration=1.795856406 podStartE2EDuration="2.248636376s" podCreationTimestamp="2025-10-02 07:12:46 +0000 UTC" firstStartedPulling="2025-10-02 07:12:46.801425875 +0000 UTC m=+1576.922609006" lastFinishedPulling="2025-10-02 07:12:47.254205845 +0000 UTC m=+1577.375388976" observedRunningTime="2025-10-02 07:12:48.244007167 +0000 UTC m=+1578.365190308" watchObservedRunningTime="2025-10-02 07:12:48.248636376 +0000 UTC m=+1578.369819508" Oct 02 07:12:49 crc kubenswrapper[4786]: I1002 
07:12:49.230668 4786 generic.go:334] "Generic (PLEG): container finished" podID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerID="57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b" exitCode=0 Oct 02 07:12:49 crc kubenswrapper[4786]: I1002 07:12:49.230722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgbgv" event={"ID":"d7b8f881-bb98-4622-8c6a-59bc7f55c342","Type":"ContainerDied","Data":"57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b"} Oct 02 07:12:50 crc kubenswrapper[4786]: I1002 07:12:50.238746 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgbgv" event={"ID":"d7b8f881-bb98-4622-8c6a-59bc7f55c342","Type":"ContainerStarted","Data":"faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1"} Oct 02 07:12:50 crc kubenswrapper[4786]: I1002 07:12:50.252200 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bgbgv" podStartSLOduration=2.591417623 podStartE2EDuration="4.252185189s" podCreationTimestamp="2025-10-02 07:12:46 +0000 UTC" firstStartedPulling="2025-10-02 07:12:48.223373286 +0000 UTC m=+1578.344556418" lastFinishedPulling="2025-10-02 07:12:49.884140853 +0000 UTC m=+1580.005323984" observedRunningTime="2025-10-02 07:12:50.249800983 +0000 UTC m=+1580.370984114" watchObservedRunningTime="2025-10-02 07:12:50.252185189 +0000 UTC m=+1580.373368320" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.307016 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qd9qj"] Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.308976 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.316662 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qd9qj"] Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.397298 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-catalog-content\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.397420 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65x6p\" (UniqueName: \"kubernetes.io/projected/869ccf91-e954-489b-9457-eddd0f68016a-kube-api-access-65x6p\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.397520 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-utilities\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.498639 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-utilities\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.498750 4786 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-catalog-content\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.498830 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65x6p\" (UniqueName: \"kubernetes.io/projected/869ccf91-e954-489b-9457-eddd0f68016a-kube-api-access-65x6p\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.499035 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-utilities\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.499116 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-catalog-content\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.513287 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65x6p\" (UniqueName: \"kubernetes.io/projected/869ccf91-e954-489b-9457-eddd0f68016a-kube-api-access-65x6p\") pod \"redhat-marketplace-qd9qj\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.626202 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:12:56 crc kubenswrapper[4786]: I1002 07:12:56.991463 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qd9qj"] Oct 02 07:12:57 crc kubenswrapper[4786]: I1002 07:12:57.012031 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:57 crc kubenswrapper[4786]: I1002 07:12:57.012090 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:57 crc kubenswrapper[4786]: I1002 07:12:57.044371 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:57 crc kubenswrapper[4786]: I1002 07:12:57.285338 4786 generic.go:334] "Generic (PLEG): container finished" podID="869ccf91-e954-489b-9457-eddd0f68016a" containerID="b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631" exitCode=0 Oct 02 07:12:57 crc kubenswrapper[4786]: I1002 07:12:57.285416 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qd9qj" event={"ID":"869ccf91-e954-489b-9457-eddd0f68016a","Type":"ContainerDied","Data":"b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631"} Oct 02 07:12:57 crc kubenswrapper[4786]: I1002 07:12:57.285576 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qd9qj" event={"ID":"869ccf91-e954-489b-9457-eddd0f68016a","Type":"ContainerStarted","Data":"6612971bec56b844cb1d5d778b8ba1d7f2fc63f29cd27576d9b23ce2c2ff9b67"} Oct 02 07:12:57 crc kubenswrapper[4786]: I1002 07:12:57.317393 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:12:58 crc kubenswrapper[4786]: I1002 07:12:58.293626 4786 generic.go:334] 
"Generic (PLEG): container finished" podID="869ccf91-e954-489b-9457-eddd0f68016a" containerID="e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2" exitCode=0 Oct 02 07:12:58 crc kubenswrapper[4786]: I1002 07:12:58.293724 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qd9qj" event={"ID":"869ccf91-e954-489b-9457-eddd0f68016a","Type":"ContainerDied","Data":"e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2"} Oct 02 07:12:59 crc kubenswrapper[4786]: I1002 07:12:59.293017 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bgbgv"] Oct 02 07:12:59 crc kubenswrapper[4786]: I1002 07:12:59.302506 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qd9qj" event={"ID":"869ccf91-e954-489b-9457-eddd0f68016a","Type":"ContainerStarted","Data":"2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632"} Oct 02 07:12:59 crc kubenswrapper[4786]: I1002 07:12:59.302632 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bgbgv" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerName="registry-server" containerID="cri-o://faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1" gracePeriod=2 Oct 02 07:12:59 crc kubenswrapper[4786]: I1002 07:12:59.316615 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qd9qj" podStartSLOduration=1.828374276 podStartE2EDuration="3.316600516s" podCreationTimestamp="2025-10-02 07:12:56 +0000 UTC" firstStartedPulling="2025-10-02 07:12:57.286623376 +0000 UTC m=+1587.407806507" lastFinishedPulling="2025-10-02 07:12:58.774849616 +0000 UTC m=+1588.896032747" observedRunningTime="2025-10-02 07:12:59.315291157 +0000 UTC m=+1589.436474298" watchObservedRunningTime="2025-10-02 07:12:59.316600516 +0000 UTC m=+1589.437783647" Oct 02 
07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.032967 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-crkmc"] Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.039400 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-crkmc"] Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.168137 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.194408 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478c626d-63fa-4342-b4b7-59d09c6ce3c1" path="/var/lib/kubelet/pods/478c626d-63fa-4342-b4b7-59d09c6ce3c1/volumes" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.264065 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-catalog-content\") pod \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.264168 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whdjg\" (UniqueName: \"kubernetes.io/projected/d7b8f881-bb98-4622-8c6a-59bc7f55c342-kube-api-access-whdjg\") pod \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.264289 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-utilities\") pod \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\" (UID: \"d7b8f881-bb98-4622-8c6a-59bc7f55c342\") " Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.264844 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-utilities" (OuterVolumeSpecName: "utilities") pod "d7b8f881-bb98-4622-8c6a-59bc7f55c342" (UID: "d7b8f881-bb98-4622-8c6a-59bc7f55c342"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.265646 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.268483 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7b8f881-bb98-4622-8c6a-59bc7f55c342-kube-api-access-whdjg" (OuterVolumeSpecName: "kube-api-access-whdjg") pod "d7b8f881-bb98-4622-8c6a-59bc7f55c342" (UID: "d7b8f881-bb98-4622-8c6a-59bc7f55c342"). InnerVolumeSpecName "kube-api-access-whdjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.311029 4786 generic.go:334] "Generic (PLEG): container finished" podID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerID="faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1" exitCode=0 Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.312070 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bgbgv" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.312468 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgbgv" event={"ID":"d7b8f881-bb98-4622-8c6a-59bc7f55c342","Type":"ContainerDied","Data":"faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1"} Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.312521 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bgbgv" event={"ID":"d7b8f881-bb98-4622-8c6a-59bc7f55c342","Type":"ContainerDied","Data":"bd75955d72471f546555d30bb694a3b5afae96f433f311a5ece611dfd0fbec62"} Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.312551 4786 scope.go:117] "RemoveContainer" containerID="faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.316061 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7b8f881-bb98-4622-8c6a-59bc7f55c342" (UID: "d7b8f881-bb98-4622-8c6a-59bc7f55c342"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.332835 4786 scope.go:117] "RemoveContainer" containerID="57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.347485 4786 scope.go:117] "RemoveContainer" containerID="67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.367069 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7b8f881-bb98-4622-8c6a-59bc7f55c342-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.367092 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whdjg\" (UniqueName: \"kubernetes.io/projected/d7b8f881-bb98-4622-8c6a-59bc7f55c342-kube-api-access-whdjg\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.378262 4786 scope.go:117] "RemoveContainer" containerID="faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1" Oct 02 07:13:00 crc kubenswrapper[4786]: E1002 07:13:00.378543 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1\": container with ID starting with faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1 not found: ID does not exist" containerID="faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.378569 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1"} err="failed to get container status \"faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1\": rpc error: code = NotFound desc = could not find 
container \"faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1\": container with ID starting with faa6bd7c74c5805046ed90571e530277cd730151071f78d3945605ffc93e34e1 not found: ID does not exist" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.378589 4786 scope.go:117] "RemoveContainer" containerID="57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b" Oct 02 07:13:00 crc kubenswrapper[4786]: E1002 07:13:00.379000 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b\": container with ID starting with 57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b not found: ID does not exist" containerID="57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.379020 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b"} err="failed to get container status \"57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b\": rpc error: code = NotFound desc = could not find container \"57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b\": container with ID starting with 57515650f0a4335ff708464a43b65761a019a9f29a98c606ce918762657cdc7b not found: ID does not exist" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.379033 4786 scope.go:117] "RemoveContainer" containerID="67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132" Oct 02 07:13:00 crc kubenswrapper[4786]: E1002 07:13:00.379328 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132\": container with ID starting with 67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132 not found: ID does 
not exist" containerID="67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.379373 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132"} err="failed to get container status \"67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132\": rpc error: code = NotFound desc = could not find container \"67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132\": container with ID starting with 67ff9cb22ffd54eea2f28554fade46f0d70fae156d6b4a8fb5524c72deb46132 not found: ID does not exist" Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.634331 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bgbgv"] Oct 02 07:13:00 crc kubenswrapper[4786]: I1002 07:13:00.640827 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bgbgv"] Oct 02 07:13:01 crc kubenswrapper[4786]: I1002 07:13:01.017318 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-nd9rf"] Oct 02 07:13:01 crc kubenswrapper[4786]: I1002 07:13:01.022799 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-nd9rf"] Oct 02 07:13:01 crc kubenswrapper[4786]: I1002 07:13:01.179238 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:13:01 crc kubenswrapper[4786]: E1002 07:13:01.179534 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" 
podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:13:02 crc kubenswrapper[4786]: I1002 07:13:02.186965 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af71207-973b-401b-bba1-e78a23a043b5" path="/var/lib/kubelet/pods/2af71207-973b-401b-bba1-e78a23a043b5/volumes" Oct 02 07:13:02 crc kubenswrapper[4786]: I1002 07:13:02.187548 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" path="/var/lib/kubelet/pods/d7b8f881-bb98-4622-8c6a-59bc7f55c342/volumes" Oct 02 07:13:06 crc kubenswrapper[4786]: I1002 07:13:06.626656 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:13:06 crc kubenswrapper[4786]: I1002 07:13:06.627006 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:13:06 crc kubenswrapper[4786]: I1002 07:13:06.656792 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:13:07 crc kubenswrapper[4786]: I1002 07:13:07.383506 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:13:07 crc kubenswrapper[4786]: I1002 07:13:07.420242 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qd9qj"] Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.363320 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qd9qj" podUID="869ccf91-e954-489b-9457-eddd0f68016a" containerName="registry-server" containerID="cri-o://2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632" gracePeriod=2 Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.699842 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.897055 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65x6p\" (UniqueName: \"kubernetes.io/projected/869ccf91-e954-489b-9457-eddd0f68016a-kube-api-access-65x6p\") pod \"869ccf91-e954-489b-9457-eddd0f68016a\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.897197 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-utilities\") pod \"869ccf91-e954-489b-9457-eddd0f68016a\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.897220 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-catalog-content\") pod \"869ccf91-e954-489b-9457-eddd0f68016a\" (UID: \"869ccf91-e954-489b-9457-eddd0f68016a\") " Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.898212 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-utilities" (OuterVolumeSpecName: "utilities") pod "869ccf91-e954-489b-9457-eddd0f68016a" (UID: "869ccf91-e954-489b-9457-eddd0f68016a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.901871 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869ccf91-e954-489b-9457-eddd0f68016a-kube-api-access-65x6p" (OuterVolumeSpecName: "kube-api-access-65x6p") pod "869ccf91-e954-489b-9457-eddd0f68016a" (UID: "869ccf91-e954-489b-9457-eddd0f68016a"). InnerVolumeSpecName "kube-api-access-65x6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.905991 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "869ccf91-e954-489b-9457-eddd0f68016a" (UID: "869ccf91-e954-489b-9457-eddd0f68016a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.999260 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65x6p\" (UniqueName: \"kubernetes.io/projected/869ccf91-e954-489b-9457-eddd0f68016a-kube-api-access-65x6p\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.999300 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:09 crc kubenswrapper[4786]: I1002 07:13:09.999310 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869ccf91-e954-489b-9457-eddd0f68016a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.371278 4786 generic.go:334] "Generic (PLEG): container finished" podID="869ccf91-e954-489b-9457-eddd0f68016a" containerID="2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632" exitCode=0 Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.371331 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qd9qj" event={"ID":"869ccf91-e954-489b-9457-eddd0f68016a","Type":"ContainerDied","Data":"2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632"} Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.371356 4786 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-qd9qj" event={"ID":"869ccf91-e954-489b-9457-eddd0f68016a","Type":"ContainerDied","Data":"6612971bec56b844cb1d5d778b8ba1d7f2fc63f29cd27576d9b23ce2c2ff9b67"} Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.371373 4786 scope.go:117] "RemoveContainer" containerID="2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.371434 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qd9qj" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.386993 4786 scope.go:117] "RemoveContainer" containerID="e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.387123 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qd9qj"] Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.392799 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qd9qj"] Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.404062 4786 scope.go:117] "RemoveContainer" containerID="b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.433144 4786 scope.go:117] "RemoveContainer" containerID="2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632" Oct 02 07:13:10 crc kubenswrapper[4786]: E1002 07:13:10.433508 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632\": container with ID starting with 2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632 not found: ID does not exist" containerID="2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.433560 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632"} err="failed to get container status \"2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632\": rpc error: code = NotFound desc = could not find container \"2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632\": container with ID starting with 2ed09b820fbe70001125d56d2b3c3d5b1c52a9da16db796df58c0fd96e2a6632 not found: ID does not exist" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.433582 4786 scope.go:117] "RemoveContainer" containerID="e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2" Oct 02 07:13:10 crc kubenswrapper[4786]: E1002 07:13:10.433894 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2\": container with ID starting with e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2 not found: ID does not exist" containerID="e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.433914 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2"} err="failed to get container status \"e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2\": rpc error: code = NotFound desc = could not find container \"e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2\": container with ID starting with e3a5b076c74bd95e2f3fd158d17fcdf48d8cdf0a4d540edfd921d7a38e6622b2 not found: ID does not exist" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.433928 4786 scope.go:117] "RemoveContainer" containerID="b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631" Oct 02 07:13:10 crc kubenswrapper[4786]: E1002 
07:13:10.434275 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631\": container with ID starting with b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631 not found: ID does not exist" containerID="b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631" Oct 02 07:13:10 crc kubenswrapper[4786]: I1002 07:13:10.434314 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631"} err="failed to get container status \"b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631\": rpc error: code = NotFound desc = could not find container \"b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631\": container with ID starting with b7e52d94a6085fcad4025da4f90074e3585d089f1f2f54579054657acc886631 not found: ID does not exist" Oct 02 07:13:12 crc kubenswrapper[4786]: I1002 07:13:12.186994 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869ccf91-e954-489b-9457-eddd0f68016a" path="/var/lib/kubelet/pods/869ccf91-e954-489b-9457-eddd0f68016a/volumes" Oct 02 07:13:15 crc kubenswrapper[4786]: I1002 07:13:15.179268 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:13:15 crc kubenswrapper[4786]: E1002 07:13:15.180400 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:13:17 crc kubenswrapper[4786]: I1002 07:13:17.416860 
4786 generic.go:334] "Generic (PLEG): container finished" podID="169def9f-29e2-41fb-bf34-86464f366256" containerID="b8813c885f87b90e15bb3407b9b8d600e8472291bf9e6ff6c747f84c93c14e69" exitCode=0 Oct 02 07:13:17 crc kubenswrapper[4786]: I1002 07:13:17.416881 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" event={"ID":"169def9f-29e2-41fb-bf34-86464f366256","Type":"ContainerDied","Data":"b8813c885f87b90e15bb3407b9b8d600e8472291bf9e6ff6c747f84c93c14e69"} Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.719498 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.827821 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95rl6\" (UniqueName: \"kubernetes.io/projected/169def9f-29e2-41fb-bf34-86464f366256-kube-api-access-95rl6\") pod \"169def9f-29e2-41fb-bf34-86464f366256\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.828032 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-inventory\") pod \"169def9f-29e2-41fb-bf34-86464f366256\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.828121 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-ssh-key\") pod \"169def9f-29e2-41fb-bf34-86464f366256\" (UID: \"169def9f-29e2-41fb-bf34-86464f366256\") " Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.831976 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/169def9f-29e2-41fb-bf34-86464f366256-kube-api-access-95rl6" (OuterVolumeSpecName: "kube-api-access-95rl6") pod "169def9f-29e2-41fb-bf34-86464f366256" (UID: "169def9f-29e2-41fb-bf34-86464f366256"). InnerVolumeSpecName "kube-api-access-95rl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.847519 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-inventory" (OuterVolumeSpecName: "inventory") pod "169def9f-29e2-41fb-bf34-86464f366256" (UID: "169def9f-29e2-41fb-bf34-86464f366256"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.848998 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "169def9f-29e2-41fb-bf34-86464f366256" (UID: "169def9f-29e2-41fb-bf34-86464f366256"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.929836 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95rl6\" (UniqueName: \"kubernetes.io/projected/169def9f-29e2-41fb-bf34-86464f366256-kube-api-access-95rl6\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.929862 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:18 crc kubenswrapper[4786]: I1002 07:13:18.929871 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/169def9f-29e2-41fb-bf34-86464f366256-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.429524 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" event={"ID":"169def9f-29e2-41fb-bf34-86464f366256","Type":"ContainerDied","Data":"50c6e90e427e91dd58cae2556321b1e517776e559a5c2081796f30e75d991113"} Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.429555 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.429565 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50c6e90e427e91dd58cae2556321b1e517776e559a5c2081796f30e75d991113" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483142 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ck7z4"] Oct 02 07:13:19 crc kubenswrapper[4786]: E1002 07:13:19.483518 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerName="extract-utilities" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483536 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerName="extract-utilities" Oct 02 07:13:19 crc kubenswrapper[4786]: E1002 07:13:19.483554 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerName="extract-content" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483559 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerName="extract-content" Oct 02 07:13:19 crc kubenswrapper[4786]: E1002 07:13:19.483571 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869ccf91-e954-489b-9457-eddd0f68016a" containerName="extract-content" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483577 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="869ccf91-e954-489b-9457-eddd0f68016a" containerName="extract-content" Oct 02 07:13:19 crc kubenswrapper[4786]: E1002 07:13:19.483601 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869ccf91-e954-489b-9457-eddd0f68016a" containerName="extract-utilities" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483606 4786 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="869ccf91-e954-489b-9457-eddd0f68016a" containerName="extract-utilities" Oct 02 07:13:19 crc kubenswrapper[4786]: E1002 07:13:19.483615 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869ccf91-e954-489b-9457-eddd0f68016a" containerName="registry-server" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483620 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="869ccf91-e954-489b-9457-eddd0f68016a" containerName="registry-server" Oct 02 07:13:19 crc kubenswrapper[4786]: E1002 07:13:19.483627 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="169def9f-29e2-41fb-bf34-86464f366256" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483633 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="169def9f-29e2-41fb-bf34-86464f366256" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:19 crc kubenswrapper[4786]: E1002 07:13:19.483644 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerName="registry-server" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483649 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerName="registry-server" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483844 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7b8f881-bb98-4622-8c6a-59bc7f55c342" containerName="registry-server" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483872 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="169def9f-29e2-41fb-bf34-86464f366256" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.483884 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="869ccf91-e954-489b-9457-eddd0f68016a" containerName="registry-server" Oct 02 07:13:19 crc 
kubenswrapper[4786]: I1002 07:13:19.484433 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.488521 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.488736 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.489148 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.489257 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.493834 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ck7z4"] Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.640815 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjd6n\" (UniqueName: \"kubernetes.io/projected/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-kube-api-access-xjd6n\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.641010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: 
I1002 07:13:19.641055 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.742267 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjd6n\" (UniqueName: \"kubernetes.io/projected/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-kube-api-access-xjd6n\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.742371 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.742389 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.745534 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.746157 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.755473 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjd6n\" (UniqueName: \"kubernetes.io/projected/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-kube-api-access-xjd6n\") pod \"ssh-known-hosts-edpm-deployment-ck7z4\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:19 crc kubenswrapper[4786]: I1002 07:13:19.801908 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.206654 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ck7z4"] Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.437468 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" event={"ID":"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84","Type":"ContainerStarted","Data":"5183d814c938418782011f1ca79d8a66c0c5b0dfc19e0f760f46ff280f48a656"} Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.503141 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-58q88"] Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.504802 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.514059 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58q88"] Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.656332 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-catalog-content\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.656378 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfxbn\" (UniqueName: \"kubernetes.io/projected/17595804-2fb0-458d-ba28-2ff0542e1a33-kube-api-access-cfxbn\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.656649 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-utilities\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.758737 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfxbn\" (UniqueName: \"kubernetes.io/projected/17595804-2fb0-458d-ba28-2ff0542e1a33-kube-api-access-cfxbn\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.758875 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-utilities\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.759059 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-catalog-content\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.759649 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-utilities\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.759729 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-catalog-content\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.773890 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfxbn\" (UniqueName: \"kubernetes.io/projected/17595804-2fb0-458d-ba28-2ff0542e1a33-kube-api-access-cfxbn\") pod \"certified-operators-58q88\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:20 crc kubenswrapper[4786]: I1002 07:13:20.821549 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:21 crc kubenswrapper[4786]: I1002 07:13:21.242285 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-58q88"] Oct 02 07:13:21 crc kubenswrapper[4786]: W1002 07:13:21.247024 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17595804_2fb0_458d_ba28_2ff0542e1a33.slice/crio-a2a643749bddcefe98a8d1bc5b229f421f3bead136afa410814b80987559d7ca WatchSource:0}: Error finding container a2a643749bddcefe98a8d1bc5b229f421f3bead136afa410814b80987559d7ca: Status 404 returned error can't find the container with id a2a643749bddcefe98a8d1bc5b229f421f3bead136afa410814b80987559d7ca Oct 02 07:13:21 crc kubenswrapper[4786]: I1002 07:13:21.444896 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" event={"ID":"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84","Type":"ContainerStarted","Data":"c3f3a465967ab3a2b95e43616320f2faea1a4680f19eb3485194248e1051e313"} Oct 02 07:13:21 crc kubenswrapper[4786]: I1002 07:13:21.448116 4786 generic.go:334] "Generic (PLEG): container finished" podID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerID="6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53" exitCode=0 Oct 02 07:13:21 crc kubenswrapper[4786]: I1002 07:13:21.448145 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58q88" event={"ID":"17595804-2fb0-458d-ba28-2ff0542e1a33","Type":"ContainerDied","Data":"6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53"} Oct 02 07:13:21 crc kubenswrapper[4786]: I1002 07:13:21.448159 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58q88" 
event={"ID":"17595804-2fb0-458d-ba28-2ff0542e1a33","Type":"ContainerStarted","Data":"a2a643749bddcefe98a8d1bc5b229f421f3bead136afa410814b80987559d7ca"} Oct 02 07:13:21 crc kubenswrapper[4786]: I1002 07:13:21.463494 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" podStartSLOduration=1.6765750480000001 podStartE2EDuration="2.463482834s" podCreationTimestamp="2025-10-02 07:13:19 +0000 UTC" firstStartedPulling="2025-10-02 07:13:20.208735618 +0000 UTC m=+1610.329918749" lastFinishedPulling="2025-10-02 07:13:20.995643404 +0000 UTC m=+1611.116826535" observedRunningTime="2025-10-02 07:13:21.457879406 +0000 UTC m=+1611.579062548" watchObservedRunningTime="2025-10-02 07:13:21.463482834 +0000 UTC m=+1611.584665965" Oct 02 07:13:22 crc kubenswrapper[4786]: I1002 07:13:22.457281 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58q88" event={"ID":"17595804-2fb0-458d-ba28-2ff0542e1a33","Type":"ContainerStarted","Data":"3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5"} Oct 02 07:13:23 crc kubenswrapper[4786]: I1002 07:13:23.466276 4786 generic.go:334] "Generic (PLEG): container finished" podID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerID="3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5" exitCode=0 Oct 02 07:13:23 crc kubenswrapper[4786]: I1002 07:13:23.466315 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58q88" event={"ID":"17595804-2fb0-458d-ba28-2ff0542e1a33","Type":"ContainerDied","Data":"3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5"} Oct 02 07:13:24 crc kubenswrapper[4786]: I1002 07:13:24.486968 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58q88" 
event={"ID":"17595804-2fb0-458d-ba28-2ff0542e1a33","Type":"ContainerStarted","Data":"ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2"} Oct 02 07:13:24 crc kubenswrapper[4786]: I1002 07:13:24.505476 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-58q88" podStartSLOduration=2.021614585 podStartE2EDuration="4.505464046s" podCreationTimestamp="2025-10-02 07:13:20 +0000 UTC" firstStartedPulling="2025-10-02 07:13:21.449139044 +0000 UTC m=+1611.570322175" lastFinishedPulling="2025-10-02 07:13:23.932988505 +0000 UTC m=+1614.054171636" observedRunningTime="2025-10-02 07:13:24.5003913 +0000 UTC m=+1614.621574441" watchObservedRunningTime="2025-10-02 07:13:24.505464046 +0000 UTC m=+1614.626647177" Oct 02 07:13:26 crc kubenswrapper[4786]: I1002 07:13:26.499966 4786 generic.go:334] "Generic (PLEG): container finished" podID="d54f4fa9-80d2-47dc-a156-b26cbc9ebd84" containerID="c3f3a465967ab3a2b95e43616320f2faea1a4680f19eb3485194248e1051e313" exitCode=0 Oct 02 07:13:26 crc kubenswrapper[4786]: I1002 07:13:26.500055 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" event={"ID":"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84","Type":"ContainerDied","Data":"c3f3a465967ab3a2b95e43616320f2faea1a4680f19eb3485194248e1051e313"} Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.178575 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:13:27 crc kubenswrapper[4786]: E1002 07:13:27.178992 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" 
podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.801717 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.858600 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-ssh-key-openstack-edpm-ipam\") pod \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.858717 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-inventory-0\") pod \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.858750 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjd6n\" (UniqueName: \"kubernetes.io/projected/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-kube-api-access-xjd6n\") pod \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\" (UID: \"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84\") " Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.862745 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-kube-api-access-xjd6n" (OuterVolumeSpecName: "kube-api-access-xjd6n") pod "d54f4fa9-80d2-47dc-a156-b26cbc9ebd84" (UID: "d54f4fa9-80d2-47dc-a156-b26cbc9ebd84"). InnerVolumeSpecName "kube-api-access-xjd6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.877884 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d54f4fa9-80d2-47dc-a156-b26cbc9ebd84" (UID: "d54f4fa9-80d2-47dc-a156-b26cbc9ebd84"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.878309 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d54f4fa9-80d2-47dc-a156-b26cbc9ebd84" (UID: "d54f4fa9-80d2-47dc-a156-b26cbc9ebd84"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.960143 4786 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.960174 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjd6n\" (UniqueName: \"kubernetes.io/projected/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-kube-api-access-xjd6n\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:27 crc kubenswrapper[4786]: I1002 07:13:27.960186 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d54f4fa9-80d2-47dc-a156-b26cbc9ebd84-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.513113 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" 
event={"ID":"d54f4fa9-80d2-47dc-a156-b26cbc9ebd84","Type":"ContainerDied","Data":"5183d814c938418782011f1ca79d8a66c0c5b0dfc19e0f760f46ff280f48a656"} Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.513148 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5183d814c938418782011f1ca79d8a66c0c5b0dfc19e0f760f46ff280f48a656" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.513156 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ck7z4" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.555017 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl"] Oct 02 07:13:28 crc kubenswrapper[4786]: E1002 07:13:28.555432 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54f4fa9-80d2-47dc-a156-b26cbc9ebd84" containerName="ssh-known-hosts-edpm-deployment" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.555449 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54f4fa9-80d2-47dc-a156-b26cbc9ebd84" containerName="ssh-known-hosts-edpm-deployment" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.555704 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54f4fa9-80d2-47dc-a156-b26cbc9ebd84" containerName="ssh-known-hosts-edpm-deployment" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.556260 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.558918 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.558925 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.558952 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.559205 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.567851 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.567916 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.567941 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gpd\" (UniqueName: \"kubernetes.io/projected/3fe8db1d-ac2c-4028-a843-a74d8e787543-kube-api-access-52gpd\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.569382 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl"] Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.669536 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.669595 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.669619 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52gpd\" (UniqueName: \"kubernetes.io/projected/3fe8db1d-ac2c-4028-a843-a74d8e787543-kube-api-access-52gpd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.672726 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.673298 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.683288 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gpd\" (UniqueName: \"kubernetes.io/projected/3fe8db1d-ac2c-4028-a843-a74d8e787543-kube-api-access-52gpd\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8hhjl\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:28 crc kubenswrapper[4786]: I1002 07:13:28.868036 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:29 crc kubenswrapper[4786]: I1002 07:13:29.272955 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl"] Oct 02 07:13:29 crc kubenswrapper[4786]: W1002 07:13:29.275605 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fe8db1d_ac2c_4028_a843_a74d8e787543.slice/crio-29548592d5be1a0180390862df85df79350f83804dc7e43e0752d3b9364f08de WatchSource:0}: Error finding container 29548592d5be1a0180390862df85df79350f83804dc7e43e0752d3b9364f08de: Status 404 returned error can't find the container with id 29548592d5be1a0180390862df85df79350f83804dc7e43e0752d3b9364f08de Oct 02 07:13:29 crc kubenswrapper[4786]: I1002 07:13:29.520518 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" event={"ID":"3fe8db1d-ac2c-4028-a843-a74d8e787543","Type":"ContainerStarted","Data":"29548592d5be1a0180390862df85df79350f83804dc7e43e0752d3b9364f08de"} Oct 02 07:13:30 crc kubenswrapper[4786]: I1002 07:13:30.527147 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" event={"ID":"3fe8db1d-ac2c-4028-a843-a74d8e787543","Type":"ContainerStarted","Data":"e88e536c4b2337cb4331e1d403911ee690506d15de759288c9a5ec722b3ccaff"} Oct 02 07:13:30 crc kubenswrapper[4786]: I1002 07:13:30.542412 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" podStartSLOduration=2.054017641 podStartE2EDuration="2.542399148s" podCreationTimestamp="2025-10-02 07:13:28 +0000 UTC" firstStartedPulling="2025-10-02 07:13:29.27760244 +0000 UTC m=+1619.398785572" lastFinishedPulling="2025-10-02 07:13:29.765983948 +0000 UTC m=+1619.887167079" observedRunningTime="2025-10-02 
07:13:30.536022061 +0000 UTC m=+1620.657205212" watchObservedRunningTime="2025-10-02 07:13:30.542399148 +0000 UTC m=+1620.663582279" Oct 02 07:13:30 crc kubenswrapper[4786]: I1002 07:13:30.821817 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:30 crc kubenswrapper[4786]: I1002 07:13:30.822067 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:30 crc kubenswrapper[4786]: I1002 07:13:30.856227 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:31 crc kubenswrapper[4786]: I1002 07:13:31.567764 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:31 crc kubenswrapper[4786]: I1002 07:13:31.597867 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58q88"] Oct 02 07:13:33 crc kubenswrapper[4786]: I1002 07:13:33.557619 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-58q88" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerName="registry-server" containerID="cri-o://ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2" gracePeriod=2 Oct 02 07:13:33 crc kubenswrapper[4786]: I1002 07:13:33.907815 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.052598 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-utilities\") pod \"17595804-2fb0-458d-ba28-2ff0542e1a33\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.052882 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-catalog-content\") pod \"17595804-2fb0-458d-ba28-2ff0542e1a33\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.053149 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfxbn\" (UniqueName: \"kubernetes.io/projected/17595804-2fb0-458d-ba28-2ff0542e1a33-kube-api-access-cfxbn\") pod \"17595804-2fb0-458d-ba28-2ff0542e1a33\" (UID: \"17595804-2fb0-458d-ba28-2ff0542e1a33\") " Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.053401 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-utilities" (OuterVolumeSpecName: "utilities") pod "17595804-2fb0-458d-ba28-2ff0542e1a33" (UID: "17595804-2fb0-458d-ba28-2ff0542e1a33"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.053892 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.056840 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17595804-2fb0-458d-ba28-2ff0542e1a33-kube-api-access-cfxbn" (OuterVolumeSpecName: "kube-api-access-cfxbn") pod "17595804-2fb0-458d-ba28-2ff0542e1a33" (UID: "17595804-2fb0-458d-ba28-2ff0542e1a33"). InnerVolumeSpecName "kube-api-access-cfxbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.082455 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17595804-2fb0-458d-ba28-2ff0542e1a33" (UID: "17595804-2fb0-458d-ba28-2ff0542e1a33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.155274 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfxbn\" (UniqueName: \"kubernetes.io/projected/17595804-2fb0-458d-ba28-2ff0542e1a33-kube-api-access-cfxbn\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.155299 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17595804-2fb0-458d-ba28-2ff0542e1a33-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.568199 4786 generic.go:334] "Generic (PLEG): container finished" podID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerID="ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2" exitCode=0 Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.568240 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58q88" event={"ID":"17595804-2fb0-458d-ba28-2ff0542e1a33","Type":"ContainerDied","Data":"ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2"} Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.568266 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-58q88" event={"ID":"17595804-2fb0-458d-ba28-2ff0542e1a33","Type":"ContainerDied","Data":"a2a643749bddcefe98a8d1bc5b229f421f3bead136afa410814b80987559d7ca"} Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.568282 4786 scope.go:117] "RemoveContainer" containerID="ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.568401 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-58q88" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.584749 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-58q88"] Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.587829 4786 scope.go:117] "RemoveContainer" containerID="3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.589242 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-58q88"] Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.603764 4786 scope.go:117] "RemoveContainer" containerID="6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.633976 4786 scope.go:117] "RemoveContainer" containerID="ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2" Oct 02 07:13:34 crc kubenswrapper[4786]: E1002 07:13:34.634351 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2\": container with ID starting with ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2 not found: ID does not exist" containerID="ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.634390 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2"} err="failed to get container status \"ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2\": rpc error: code = NotFound desc = could not find container \"ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2\": container with ID starting with ce9f8e6d577846f2c6ac84bbfe71baafa5551969794bb6aa8add868e2adf3eb2 not 
found: ID does not exist" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.634417 4786 scope.go:117] "RemoveContainer" containerID="3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5" Oct 02 07:13:34 crc kubenswrapper[4786]: E1002 07:13:34.634773 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5\": container with ID starting with 3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5 not found: ID does not exist" containerID="3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.634810 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5"} err="failed to get container status \"3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5\": rpc error: code = NotFound desc = could not find container \"3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5\": container with ID starting with 3211365b0010348e958aa20a69281060aa2b8fd3dfe115672e235b3d317ce2e5 not found: ID does not exist" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.634839 4786 scope.go:117] "RemoveContainer" containerID="6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53" Oct 02 07:13:34 crc kubenswrapper[4786]: E1002 07:13:34.635149 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53\": container with ID starting with 6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53 not found: ID does not exist" containerID="6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53" Oct 02 07:13:34 crc kubenswrapper[4786]: I1002 07:13:34.635195 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53"} err="failed to get container status \"6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53\": rpc error: code = NotFound desc = could not find container \"6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53\": container with ID starting with 6071e3f72b193c29ce107ab7551620f2162ba5c0573407f35ee6c9d773704d53 not found: ID does not exist" Oct 02 07:13:35 crc kubenswrapper[4786]: I1002 07:13:35.575210 4786 generic.go:334] "Generic (PLEG): container finished" podID="3fe8db1d-ac2c-4028-a843-a74d8e787543" containerID="e88e536c4b2337cb4331e1d403911ee690506d15de759288c9a5ec722b3ccaff" exitCode=0 Oct 02 07:13:35 crc kubenswrapper[4786]: I1002 07:13:35.575245 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" event={"ID":"3fe8db1d-ac2c-4028-a843-a74d8e787543","Type":"ContainerDied","Data":"e88e536c4b2337cb4331e1d403911ee690506d15de759288c9a5ec722b3ccaff"} Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.186798 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" path="/var/lib/kubelet/pods/17595804-2fb0-458d-ba28-2ff0542e1a33/volumes" Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.867856 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.893102 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-inventory\") pod \"3fe8db1d-ac2c-4028-a843-a74d8e787543\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.893215 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-ssh-key\") pod \"3fe8db1d-ac2c-4028-a843-a74d8e787543\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.893245 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52gpd\" (UniqueName: \"kubernetes.io/projected/3fe8db1d-ac2c-4028-a843-a74d8e787543-kube-api-access-52gpd\") pod \"3fe8db1d-ac2c-4028-a843-a74d8e787543\" (UID: \"3fe8db1d-ac2c-4028-a843-a74d8e787543\") " Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.897164 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe8db1d-ac2c-4028-a843-a74d8e787543-kube-api-access-52gpd" (OuterVolumeSpecName: "kube-api-access-52gpd") pod "3fe8db1d-ac2c-4028-a843-a74d8e787543" (UID: "3fe8db1d-ac2c-4028-a843-a74d8e787543"). InnerVolumeSpecName "kube-api-access-52gpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.912870 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-inventory" (OuterVolumeSpecName: "inventory") pod "3fe8db1d-ac2c-4028-a843-a74d8e787543" (UID: "3fe8db1d-ac2c-4028-a843-a74d8e787543"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.913821 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3fe8db1d-ac2c-4028-a843-a74d8e787543" (UID: "3fe8db1d-ac2c-4028-a843-a74d8e787543"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.994657 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.994777 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3fe8db1d-ac2c-4028-a843-a74d8e787543-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:36 crc kubenswrapper[4786]: I1002 07:13:36.994871 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52gpd\" (UniqueName: \"kubernetes.io/projected/3fe8db1d-ac2c-4028-a843-a74d8e787543-kube-api-access-52gpd\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.588430 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" event={"ID":"3fe8db1d-ac2c-4028-a843-a74d8e787543","Type":"ContainerDied","Data":"29548592d5be1a0180390862df85df79350f83804dc7e43e0752d3b9364f08de"} Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.588792 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29548592d5be1a0180390862df85df79350f83804dc7e43e0752d3b9364f08de" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.588464 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8hhjl" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.633297 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf"] Oct 02 07:13:37 crc kubenswrapper[4786]: E1002 07:13:37.633645 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerName="registry-server" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.633662 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerName="registry-server" Oct 02 07:13:37 crc kubenswrapper[4786]: E1002 07:13:37.633682 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerName="extract-utilities" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.633700 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerName="extract-utilities" Oct 02 07:13:37 crc kubenswrapper[4786]: E1002 07:13:37.633715 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe8db1d-ac2c-4028-a843-a74d8e787543" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.633722 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe8db1d-ac2c-4028-a843-a74d8e787543" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:37 crc kubenswrapper[4786]: E1002 07:13:37.633742 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerName="extract-content" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.633749 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerName="extract-content" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.633955 4786 
memory_manager.go:354] "RemoveStaleState removing state" podUID="17595804-2fb0-458d-ba28-2ff0542e1a33" containerName="registry-server" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.633978 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fe8db1d-ac2c-4028-a843-a74d8e787543" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.634518 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.636815 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.639270 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf"] Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.639720 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.639811 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.641460 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.807237 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l6w4\" (UniqueName: \"kubernetes.io/projected/1f6f8032-e0d0-460f-bc14-653a74481964-kube-api-access-8l6w4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 
07:13:37.807636 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.807707 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.909040 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.909084 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.909145 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l6w4\" (UniqueName: \"kubernetes.io/projected/1f6f8032-e0d0-460f-bc14-653a74481964-kube-api-access-8l6w4\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.911830 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.911838 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.922748 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l6w4\" (UniqueName: \"kubernetes.io/projected/1f6f8032-e0d0-460f-bc14-653a74481964-kube-api-access-8l6w4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:37 crc kubenswrapper[4786]: I1002 07:13:37.949021 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:38 crc kubenswrapper[4786]: I1002 07:13:38.345242 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf"] Oct 02 07:13:38 crc kubenswrapper[4786]: I1002 07:13:38.594630 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" event={"ID":"1f6f8032-e0d0-460f-bc14-653a74481964","Type":"ContainerStarted","Data":"a3c0f32e4d669a8e8514ec1bfd58c1bd13207113d64dae245381a9911ac4d2aa"} Oct 02 07:13:39 crc kubenswrapper[4786]: I1002 07:13:39.116018 4786 scope.go:117] "RemoveContainer" containerID="fd599f2df437a1bffba5b17e5812f2b05418d1811e5559c815e1a2ec3193895a" Oct 02 07:13:39 crc kubenswrapper[4786]: I1002 07:13:39.147146 4786 scope.go:117] "RemoveContainer" containerID="f2bea0cf64ded2b35bd67c0ce372b50cd2bd8e4626631055b016d8c9ce305f9e" Oct 02 07:13:39 crc kubenswrapper[4786]: I1002 07:13:39.179502 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:13:39 crc kubenswrapper[4786]: E1002 07:13:39.179915 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:13:39 crc kubenswrapper[4786]: I1002 07:13:39.183441 4786 scope.go:117] "RemoveContainer" containerID="6f1063672d6933be464f6d4561355dd1715d5dbefa2a1f18e31820d72bf3cfdd" Oct 02 07:13:39 crc kubenswrapper[4786]: I1002 07:13:39.604426 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" event={"ID":"1f6f8032-e0d0-460f-bc14-653a74481964","Type":"ContainerStarted","Data":"5ddb417fdc32a749ab15b38b741d130f448b371f608f161a6b24c64706260fbd"} Oct 02 07:13:39 crc kubenswrapper[4786]: I1002 07:13:39.621389 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" podStartSLOduration=2.05254651 podStartE2EDuration="2.621375264s" podCreationTimestamp="2025-10-02 07:13:37 +0000 UTC" firstStartedPulling="2025-10-02 07:13:38.349528302 +0000 UTC m=+1628.470711433" lastFinishedPulling="2025-10-02 07:13:38.918357057 +0000 UTC m=+1629.039540187" observedRunningTime="2025-10-02 07:13:39.618614208 +0000 UTC m=+1629.739797349" watchObservedRunningTime="2025-10-02 07:13:39.621375264 +0000 UTC m=+1629.742558395" Oct 02 07:13:45 crc kubenswrapper[4786]: I1002 07:13:45.639580 4786 generic.go:334] "Generic (PLEG): container finished" podID="1f6f8032-e0d0-460f-bc14-653a74481964" containerID="5ddb417fdc32a749ab15b38b741d130f448b371f608f161a6b24c64706260fbd" exitCode=0 Oct 02 07:13:45 crc kubenswrapper[4786]: I1002 07:13:45.639670 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" event={"ID":"1f6f8032-e0d0-460f-bc14-653a74481964","Type":"ContainerDied","Data":"5ddb417fdc32a749ab15b38b741d130f448b371f608f161a6b24c64706260fbd"} Oct 02 07:13:46 crc kubenswrapper[4786]: I1002 07:13:46.029039 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-c974n"] Oct 02 07:13:46 crc kubenswrapper[4786]: I1002 07:13:46.034173 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-c974n"] Oct 02 07:13:46 crc kubenswrapper[4786]: I1002 07:13:46.186839 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181f9506-2b83-4815-b417-a5b3eff7d763" 
path="/var/lib/kubelet/pods/181f9506-2b83-4815-b417-a5b3eff7d763/volumes" Oct 02 07:13:46 crc kubenswrapper[4786]: I1002 07:13:46.906092 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.048175 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-inventory\") pod \"1f6f8032-e0d0-460f-bc14-653a74481964\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.048355 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l6w4\" (UniqueName: \"kubernetes.io/projected/1f6f8032-e0d0-460f-bc14-653a74481964-kube-api-access-8l6w4\") pod \"1f6f8032-e0d0-460f-bc14-653a74481964\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.048389 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-ssh-key\") pod \"1f6f8032-e0d0-460f-bc14-653a74481964\" (UID: \"1f6f8032-e0d0-460f-bc14-653a74481964\") " Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.052117 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f6f8032-e0d0-460f-bc14-653a74481964-kube-api-access-8l6w4" (OuterVolumeSpecName: "kube-api-access-8l6w4") pod "1f6f8032-e0d0-460f-bc14-653a74481964" (UID: "1f6f8032-e0d0-460f-bc14-653a74481964"). InnerVolumeSpecName "kube-api-access-8l6w4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.068268 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-inventory" (OuterVolumeSpecName: "inventory") pod "1f6f8032-e0d0-460f-bc14-653a74481964" (UID: "1f6f8032-e0d0-460f-bc14-653a74481964"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.068556 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1f6f8032-e0d0-460f-bc14-653a74481964" (UID: "1f6f8032-e0d0-460f-bc14-653a74481964"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.150164 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l6w4\" (UniqueName: \"kubernetes.io/projected/1f6f8032-e0d0-460f-bc14-653a74481964-kube-api-access-8l6w4\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.150186 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.150195 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f6f8032-e0d0-460f-bc14-653a74481964-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.654710 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" 
event={"ID":"1f6f8032-e0d0-460f-bc14-653a74481964","Type":"ContainerDied","Data":"a3c0f32e4d669a8e8514ec1bfd58c1bd13207113d64dae245381a9911ac4d2aa"} Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.654924 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3c0f32e4d669a8e8514ec1bfd58c1bd13207113d64dae245381a9911ac4d2aa" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.654760 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.707378 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6"] Oct 02 07:13:47 crc kubenswrapper[4786]: E1002 07:13:47.707738 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f6f8032-e0d0-460f-bc14-653a74481964" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.707758 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f6f8032-e0d0-460f-bc14-653a74481964" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.707979 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f6f8032-e0d0-460f-bc14-653a74481964" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.708526 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.711175 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.711252 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.711468 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.711713 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.711741 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.711870 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.712105 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.712114 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.716335 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6"] Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860489 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860545 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860575 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860628 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860652 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860677 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860709 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860743 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860761 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860782 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860824 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860848 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860903 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlk49\" (UniqueName: 
\"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-kube-api-access-xlk49\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.860948 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962149 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962229 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962253 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962284 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962332 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962375 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962430 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlk49\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-kube-api-access-xlk49\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: 
\"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962497 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962532 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962556 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962589 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962616 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.962655 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.965786 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc 
kubenswrapper[4786]: I1002 07:13:47.966601 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.966831 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.966895 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.966929 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.966886 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.967173 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.967436 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.967803 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.968081 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.968478 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.968655 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.968774 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:47 crc kubenswrapper[4786]: I1002 07:13:47.975402 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlk49\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-kube-api-access-xlk49\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6\" (UID: 
\"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:48 crc kubenswrapper[4786]: I1002 07:13:48.021960 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:13:48 crc kubenswrapper[4786]: I1002 07:13:48.437927 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6"] Oct 02 07:13:48 crc kubenswrapper[4786]: I1002 07:13:48.661852 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" event={"ID":"2b54f55b-d53c-449b-a2e3-6ca66ae19657","Type":"ContainerStarted","Data":"31f28fe71683152b1199ab94b5d2ab859dfc0bbb98f784e51543ca83b09c6836"} Oct 02 07:13:49 crc kubenswrapper[4786]: I1002 07:13:49.676239 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" event={"ID":"2b54f55b-d53c-449b-a2e3-6ca66ae19657","Type":"ContainerStarted","Data":"f74adc1e8d3ab6f13087222c1273ce05599deb672cb2fdbd39c55f749d8cdce6"} Oct 02 07:13:49 crc kubenswrapper[4786]: I1002 07:13:49.692157 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" podStartSLOduration=2.194424152 podStartE2EDuration="2.692144701s" podCreationTimestamp="2025-10-02 07:13:47 +0000 UTC" firstStartedPulling="2025-10-02 07:13:48.440833514 +0000 UTC m=+1638.562016645" lastFinishedPulling="2025-10-02 07:13:48.938554063 +0000 UTC m=+1639.059737194" observedRunningTime="2025-10-02 07:13:49.689423899 +0000 UTC m=+1639.810607040" watchObservedRunningTime="2025-10-02 07:13:49.692144701 +0000 UTC m=+1639.813327832" Oct 02 07:13:51 crc kubenswrapper[4786]: I1002 07:13:51.179013 4786 scope.go:117] "RemoveContainer" 
containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:13:51 crc kubenswrapper[4786]: E1002 07:13:51.179235 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:14:04 crc kubenswrapper[4786]: I1002 07:14:04.180023 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:14:04 crc kubenswrapper[4786]: E1002 07:14:04.180956 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:14:15 crc kubenswrapper[4786]: I1002 07:14:15.858327 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b54f55b-d53c-449b-a2e3-6ca66ae19657" containerID="f74adc1e8d3ab6f13087222c1273ce05599deb672cb2fdbd39c55f749d8cdce6" exitCode=0 Oct 02 07:14:15 crc kubenswrapper[4786]: I1002 07:14:15.858435 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" event={"ID":"2b54f55b-d53c-449b-a2e3-6ca66ae19657","Type":"ContainerDied","Data":"f74adc1e8d3ab6f13087222c1273ce05599deb672cb2fdbd39c55f749d8cdce6"} Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.169304 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344466 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-inventory\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344541 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344575 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-libvirt-combined-ca-bundle\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344606 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344647 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344701 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ovn-combined-ca-bundle\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344729 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-telemetry-combined-ca-bundle\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344748 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlk49\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-kube-api-access-xlk49\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344789 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ssh-key\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344803 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-nova-combined-ca-bundle\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344852 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-repo-setup-combined-ca-bundle\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344875 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-neutron-metadata-combined-ca-bundle\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344927 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-bootstrap-combined-ca-bundle\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.344956 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-ovn-default-certs-0\") pod \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\" (UID: \"2b54f55b-d53c-449b-a2e3-6ca66ae19657\") " Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.349621 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.350009 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.350038 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.350792 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.350905 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.351009 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.351420 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.352377 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.352389 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-kube-api-access-xlk49" (OuterVolumeSpecName: "kube-api-access-xlk49") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "kube-api-access-xlk49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.352464 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.352739 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.352934 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.365811 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-inventory" (OuterVolumeSpecName: "inventory") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.366655 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b54f55b-d53c-449b-a2e3-6ca66ae19657" (UID: "2b54f55b-d53c-449b-a2e3-6ca66ae19657"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447107 4786 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447142 4786 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447153 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447165 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447175 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 
07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447185 4786 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447195 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447205 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447215 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447223 4786 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447231 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlk49\" (UniqueName: \"kubernetes.io/projected/2b54f55b-d53c-449b-a2e3-6ca66ae19657-kube-api-access-xlk49\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447238 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447245 4786 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.447252 4786 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b54f55b-d53c-449b-a2e3-6ca66ae19657-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.872980 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" event={"ID":"2b54f55b-d53c-449b-a2e3-6ca66ae19657","Type":"ContainerDied","Data":"31f28fe71683152b1199ab94b5d2ab859dfc0bbb98f784e51543ca83b09c6836"} Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.873018 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f28fe71683152b1199ab94b5d2ab859dfc0bbb98f784e51543ca83b09c6836" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.873014 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.938028 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56"] Oct 02 07:14:17 crc kubenswrapper[4786]: E1002 07:14:17.938402 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b54f55b-d53c-449b-a2e3-6ca66ae19657" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.938421 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b54f55b-d53c-449b-a2e3-6ca66ae19657" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.938609 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b54f55b-d53c-449b-a2e3-6ca66ae19657" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.939135 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.942465 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.942756 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.942795 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.943396 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.943653 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:14:17 crc kubenswrapper[4786]: I1002 07:14:17.970834 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56"] Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.061877 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.062524 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.062586 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh5ds\" (UniqueName: \"kubernetes.io/projected/483f48a9-eb90-4a5b-aaa6-63e130859f16-kube-api-access-wh5ds\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.062889 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.063010 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.164291 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.164393 4786 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.164467 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.164495 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.164532 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh5ds\" (UniqueName: \"kubernetes.io/projected/483f48a9-eb90-4a5b-aaa6-63e130859f16-kube-api-access-wh5ds\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.165195 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 
07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.167220 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.167623 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.167985 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.177644 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh5ds\" (UniqueName: \"kubernetes.io/projected/483f48a9-eb90-4a5b-aaa6-63e130859f16-kube-api-access-wh5ds\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-47j56\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.178755 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:14:18 crc kubenswrapper[4786]: E1002 07:14:18.179096 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.273366 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.701527 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56"] Oct 02 07:14:18 crc kubenswrapper[4786]: I1002 07:14:18.879641 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" event={"ID":"483f48a9-eb90-4a5b-aaa6-63e130859f16","Type":"ContainerStarted","Data":"4e71530d5e41b395504c8124cc60ad1e817a0015bed1d4ba595813fdae7446b5"} Oct 02 07:14:19 crc kubenswrapper[4786]: I1002 07:14:19.887569 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" event={"ID":"483f48a9-eb90-4a5b-aaa6-63e130859f16","Type":"ContainerStarted","Data":"5d1985f4056a026bdcdc40a7cc41d472b1e11d58989ebe66ab6f4ca8b13a4b0a"} Oct 02 07:14:19 crc kubenswrapper[4786]: I1002 07:14:19.903408 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" podStartSLOduration=2.420764702 podStartE2EDuration="2.903381098s" podCreationTimestamp="2025-10-02 07:14:17 +0000 UTC" firstStartedPulling="2025-10-02 07:14:18.706254893 +0000 UTC m=+1668.827438024" lastFinishedPulling="2025-10-02 07:14:19.188871288 +0000 UTC m=+1669.310054420" observedRunningTime="2025-10-02 07:14:19.900527456 +0000 UTC m=+1670.021710597" watchObservedRunningTime="2025-10-02 07:14:19.903381098 
+0000 UTC m=+1670.024564229" Oct 02 07:14:30 crc kubenswrapper[4786]: I1002 07:14:30.184087 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:14:30 crc kubenswrapper[4786]: E1002 07:14:30.186011 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:14:39 crc kubenswrapper[4786]: I1002 07:14:39.284557 4786 scope.go:117] "RemoveContainer" containerID="b3bf505f62f0b9ea0997cd3c970ad0fc9c924141a6a43b7111eeb1beed5a8977" Oct 02 07:14:41 crc kubenswrapper[4786]: I1002 07:14:41.179890 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:14:41 crc kubenswrapper[4786]: E1002 07:14:41.180320 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:14:54 crc kubenswrapper[4786]: I1002 07:14:54.178889 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:14:54 crc kubenswrapper[4786]: E1002 07:14:54.179675 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.130288 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb"] Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.131622 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.133044 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.133303 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.148347 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb"] Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.162059 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5348404a-f352-4a0e-818c-d8633b951625-config-volume\") pod \"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.162213 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k575d\" (UniqueName: \"kubernetes.io/projected/5348404a-f352-4a0e-818c-d8633b951625-kube-api-access-k575d\") pod 
\"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.162544 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5348404a-f352-4a0e-818c-d8633b951625-secret-volume\") pod \"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.264262 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k575d\" (UniqueName: \"kubernetes.io/projected/5348404a-f352-4a0e-818c-d8633b951625-kube-api-access-k575d\") pod \"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.264396 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5348404a-f352-4a0e-818c-d8633b951625-secret-volume\") pod \"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.264473 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5348404a-f352-4a0e-818c-d8633b951625-config-volume\") pod \"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.265482 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5348404a-f352-4a0e-818c-d8633b951625-config-volume\") pod \"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.268929 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5348404a-f352-4a0e-818c-d8633b951625-secret-volume\") pod \"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.278597 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k575d\" (UniqueName: \"kubernetes.io/projected/5348404a-f352-4a0e-818c-d8633b951625-kube-api-access-k575d\") pod \"collect-profiles-29323155-zwqhb\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.447064 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:00 crc kubenswrapper[4786]: I1002 07:15:00.803870 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb"] Oct 02 07:15:01 crc kubenswrapper[4786]: I1002 07:15:01.145615 4786 generic.go:334] "Generic (PLEG): container finished" podID="5348404a-f352-4a0e-818c-d8633b951625" containerID="89d8b35a9ff3bdb572fd4144fe52b438397d38ceaddd777ac7a41c2aa2bd076d" exitCode=0 Oct 02 07:15:01 crc kubenswrapper[4786]: I1002 07:15:01.145719 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" event={"ID":"5348404a-f352-4a0e-818c-d8633b951625","Type":"ContainerDied","Data":"89d8b35a9ff3bdb572fd4144fe52b438397d38ceaddd777ac7a41c2aa2bd076d"} Oct 02 07:15:01 crc kubenswrapper[4786]: I1002 07:15:01.145843 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" event={"ID":"5348404a-f352-4a0e-818c-d8633b951625","Type":"ContainerStarted","Data":"2fe53962608e62724bb6a475752556ed5be47901b00ffc49a3f1297ce7029a4a"} Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.153620 4786 generic.go:334] "Generic (PLEG): container finished" podID="483f48a9-eb90-4a5b-aaa6-63e130859f16" containerID="5d1985f4056a026bdcdc40a7cc41d472b1e11d58989ebe66ab6f4ca8b13a4b0a" exitCode=0 Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.153722 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" event={"ID":"483f48a9-eb90-4a5b-aaa6-63e130859f16","Type":"ContainerDied","Data":"5d1985f4056a026bdcdc40a7cc41d472b1e11d58989ebe66ab6f4ca8b13a4b0a"} Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.385223 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.499371 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5348404a-f352-4a0e-818c-d8633b951625-secret-volume\") pod \"5348404a-f352-4a0e-818c-d8633b951625\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.499738 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k575d\" (UniqueName: \"kubernetes.io/projected/5348404a-f352-4a0e-818c-d8633b951625-kube-api-access-k575d\") pod \"5348404a-f352-4a0e-818c-d8633b951625\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.499835 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5348404a-f352-4a0e-818c-d8633b951625-config-volume\") pod \"5348404a-f352-4a0e-818c-d8633b951625\" (UID: \"5348404a-f352-4a0e-818c-d8633b951625\") " Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.500234 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348404a-f352-4a0e-818c-d8633b951625-config-volume" (OuterVolumeSpecName: "config-volume") pod "5348404a-f352-4a0e-818c-d8633b951625" (UID: "5348404a-f352-4a0e-818c-d8633b951625"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.503846 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5348404a-f352-4a0e-818c-d8633b951625-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5348404a-f352-4a0e-818c-d8633b951625" (UID: "5348404a-f352-4a0e-818c-d8633b951625"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.503901 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5348404a-f352-4a0e-818c-d8633b951625-kube-api-access-k575d" (OuterVolumeSpecName: "kube-api-access-k575d") pod "5348404a-f352-4a0e-818c-d8633b951625" (UID: "5348404a-f352-4a0e-818c-d8633b951625"). InnerVolumeSpecName "kube-api-access-k575d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.602056 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k575d\" (UniqueName: \"kubernetes.io/projected/5348404a-f352-4a0e-818c-d8633b951625-kube-api-access-k575d\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.602083 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5348404a-f352-4a0e-818c-d8633b951625-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:02 crc kubenswrapper[4786]: I1002 07:15:02.602093 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5348404a-f352-4a0e-818c-d8633b951625-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.161230 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" event={"ID":"5348404a-f352-4a0e-818c-d8633b951625","Type":"ContainerDied","Data":"2fe53962608e62724bb6a475752556ed5be47901b00ffc49a3f1297ce7029a4a"} Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.161253 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323155-zwqhb" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.161285 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fe53962608e62724bb6a475752556ed5be47901b00ffc49a3f1297ce7029a4a" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.465055 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.614964 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-inventory\") pod \"483f48a9-eb90-4a5b-aaa6-63e130859f16\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.615015 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovncontroller-config-0\") pod \"483f48a9-eb90-4a5b-aaa6-63e130859f16\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.615055 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ssh-key\") pod \"483f48a9-eb90-4a5b-aaa6-63e130859f16\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.615073 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh5ds\" (UniqueName: \"kubernetes.io/projected/483f48a9-eb90-4a5b-aaa6-63e130859f16-kube-api-access-wh5ds\") pod \"483f48a9-eb90-4a5b-aaa6-63e130859f16\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 
07:15:03.615221 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovn-combined-ca-bundle\") pod \"483f48a9-eb90-4a5b-aaa6-63e130859f16\" (UID: \"483f48a9-eb90-4a5b-aaa6-63e130859f16\") " Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.618953 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "483f48a9-eb90-4a5b-aaa6-63e130859f16" (UID: "483f48a9-eb90-4a5b-aaa6-63e130859f16"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.620073 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483f48a9-eb90-4a5b-aaa6-63e130859f16-kube-api-access-wh5ds" (OuterVolumeSpecName: "kube-api-access-wh5ds") pod "483f48a9-eb90-4a5b-aaa6-63e130859f16" (UID: "483f48a9-eb90-4a5b-aaa6-63e130859f16"). InnerVolumeSpecName "kube-api-access-wh5ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.633016 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "483f48a9-eb90-4a5b-aaa6-63e130859f16" (UID: "483f48a9-eb90-4a5b-aaa6-63e130859f16"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.635450 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-inventory" (OuterVolumeSpecName: "inventory") pod "483f48a9-eb90-4a5b-aaa6-63e130859f16" (UID: "483f48a9-eb90-4a5b-aaa6-63e130859f16"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.636320 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "483f48a9-eb90-4a5b-aaa6-63e130859f16" (UID: "483f48a9-eb90-4a5b-aaa6-63e130859f16"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.717353 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.717616 4786 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.717723 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:03 crc kubenswrapper[4786]: I1002 07:15:03.717805 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh5ds\" (UniqueName: \"kubernetes.io/projected/483f48a9-eb90-4a5b-aaa6-63e130859f16-kube-api-access-wh5ds\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:03 crc 
kubenswrapper[4786]: I1002 07:15:03.717885 4786 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483f48a9-eb90-4a5b-aaa6-63e130859f16-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.167987 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" event={"ID":"483f48a9-eb90-4a5b-aaa6-63e130859f16","Type":"ContainerDied","Data":"4e71530d5e41b395504c8124cc60ad1e817a0015bed1d4ba595813fdae7446b5"} Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.168012 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-47j56" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.168022 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e71530d5e41b395504c8124cc60ad1e817a0015bed1d4ba595813fdae7446b5" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.236658 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt"] Oct 02 07:15:04 crc kubenswrapper[4786]: E1002 07:15:04.240062 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5348404a-f352-4a0e-818c-d8633b951625" containerName="collect-profiles" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.240097 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5348404a-f352-4a0e-818c-d8633b951625" containerName="collect-profiles" Oct 02 07:15:04 crc kubenswrapper[4786]: E1002 07:15:04.240120 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483f48a9-eb90-4a5b-aaa6-63e130859f16" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.240128 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="483f48a9-eb90-4a5b-aaa6-63e130859f16" 
containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.240362 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5348404a-f352-4a0e-818c-d8633b951625" containerName="collect-profiles" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.240386 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="483f48a9-eb90-4a5b-aaa6-63e130859f16" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.241031 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.247999 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.248140 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.248062 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.248065 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.248465 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.248476 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.254381 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt"] Oct 02 07:15:04 crc 
kubenswrapper[4786]: I1002 07:15:04.428364 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.429096 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.429190 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.429217 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.429257 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwf4p\" (UniqueName: \"kubernetes.io/projected/5e5e53e7-0f7d-4d7a-b410-364982cf5311-kube-api-access-vwf4p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.429314 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.530525 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.530567 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwf4p\" (UniqueName: \"kubernetes.io/projected/5e5e53e7-0f7d-4d7a-b410-364982cf5311-kube-api-access-vwf4p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.530590 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.530715 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.530762 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.530796 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.533795 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-ssh-key\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.533936 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.534081 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.534171 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.534733 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.544143 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwf4p\" (UniqueName: \"kubernetes.io/projected/5e5e53e7-0f7d-4d7a-b410-364982cf5311-kube-api-access-vwf4p\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.560305 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:04 crc kubenswrapper[4786]: I1002 07:15:04.968775 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt"] Oct 02 07:15:04 crc kubenswrapper[4786]: W1002 07:15:04.970950 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e5e53e7_0f7d_4d7a_b410_364982cf5311.slice/crio-7f6824f3f1411929de02ae395f3a58d3265029f2efe3ef5b635766b48040846f WatchSource:0}: Error finding container 7f6824f3f1411929de02ae395f3a58d3265029f2efe3ef5b635766b48040846f: Status 404 returned error can't find the container with id 7f6824f3f1411929de02ae395f3a58d3265029f2efe3ef5b635766b48040846f Oct 02 07:15:05 crc kubenswrapper[4786]: I1002 07:15:05.174263 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" event={"ID":"5e5e53e7-0f7d-4d7a-b410-364982cf5311","Type":"ContainerStarted","Data":"7f6824f3f1411929de02ae395f3a58d3265029f2efe3ef5b635766b48040846f"} Oct 02 07:15:06 crc kubenswrapper[4786]: I1002 07:15:06.186095 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" event={"ID":"5e5e53e7-0f7d-4d7a-b410-364982cf5311","Type":"ContainerStarted","Data":"4ad60e05700ae7b38108c226081076c9bda79a7f17916f9d1eefd4783e2dc5ea"} Oct 02 07:15:06 crc kubenswrapper[4786]: I1002 07:15:06.198260 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" podStartSLOduration=1.669300319 podStartE2EDuration="2.198244728s" podCreationTimestamp="2025-10-02 07:15:04 +0000 UTC" firstStartedPulling="2025-10-02 07:15:04.973159614 +0000 UTC m=+1715.094342745" lastFinishedPulling="2025-10-02 07:15:05.502104023 +0000 UTC m=+1715.623287154" observedRunningTime="2025-10-02 07:15:06.19575876 +0000 UTC m=+1716.316941891" watchObservedRunningTime="2025-10-02 07:15:06.198244728 +0000 UTC m=+1716.319427858" Oct 02 07:15:07 crc kubenswrapper[4786]: I1002 07:15:07.179909 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:15:07 crc kubenswrapper[4786]: E1002 07:15:07.180358 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:15:18 crc kubenswrapper[4786]: I1002 07:15:18.179684 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:15:18 crc kubenswrapper[4786]: E1002 07:15:18.180957 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:15:32 crc kubenswrapper[4786]: I1002 07:15:32.179123 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:15:32 crc kubenswrapper[4786]: E1002 07:15:32.180242 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:15:38 crc kubenswrapper[4786]: I1002 07:15:38.383303 4786 generic.go:334] "Generic (PLEG): container finished" podID="5e5e53e7-0f7d-4d7a-b410-364982cf5311" containerID="4ad60e05700ae7b38108c226081076c9bda79a7f17916f9d1eefd4783e2dc5ea" exitCode=0 Oct 02 07:15:38 crc kubenswrapper[4786]: I1002 07:15:38.383368 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" event={"ID":"5e5e53e7-0f7d-4d7a-b410-364982cf5311","Type":"ContainerDied","Data":"4ad60e05700ae7b38108c226081076c9bda79a7f17916f9d1eefd4783e2dc5ea"} Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.686793 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.801846 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwf4p\" (UniqueName: \"kubernetes.io/projected/5e5e53e7-0f7d-4d7a-b410-364982cf5311-kube-api-access-vwf4p\") pod \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.801910 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-nova-metadata-neutron-config-0\") pod \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.802097 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-ssh-key\") pod \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.802148 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.802224 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-metadata-combined-ca-bundle\") pod \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " Oct 02 
07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.802245 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-inventory\") pod \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\" (UID: \"5e5e53e7-0f7d-4d7a-b410-364982cf5311\") " Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.806483 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5e5e53e7-0f7d-4d7a-b410-364982cf5311" (UID: "5e5e53e7-0f7d-4d7a-b410-364982cf5311"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.806778 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5e53e7-0f7d-4d7a-b410-364982cf5311-kube-api-access-vwf4p" (OuterVolumeSpecName: "kube-api-access-vwf4p") pod "5e5e53e7-0f7d-4d7a-b410-364982cf5311" (UID: "5e5e53e7-0f7d-4d7a-b410-364982cf5311"). InnerVolumeSpecName "kube-api-access-vwf4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.823059 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e5e53e7-0f7d-4d7a-b410-364982cf5311" (UID: "5e5e53e7-0f7d-4d7a-b410-364982cf5311"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.824364 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-inventory" (OuterVolumeSpecName: "inventory") pod "5e5e53e7-0f7d-4d7a-b410-364982cf5311" (UID: "5e5e53e7-0f7d-4d7a-b410-364982cf5311"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.824741 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5e5e53e7-0f7d-4d7a-b410-364982cf5311" (UID: "5e5e53e7-0f7d-4d7a-b410-364982cf5311"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.824972 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5e5e53e7-0f7d-4d7a-b410-364982cf5311" (UID: "5e5e53e7-0f7d-4d7a-b410-364982cf5311"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.904958 4786 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.904989 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.905001 4786 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.905013 4786 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.905023 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e5e53e7-0f7d-4d7a-b410-364982cf5311-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:39 crc kubenswrapper[4786]: I1002 07:15:39.905031 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwf4p\" (UniqueName: \"kubernetes.io/projected/5e5e53e7-0f7d-4d7a-b410-364982cf5311-kube-api-access-vwf4p\") on node \"crc\" DevicePath \"\"" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.396114 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" 
event={"ID":"5e5e53e7-0f7d-4d7a-b410-364982cf5311","Type":"ContainerDied","Data":"7f6824f3f1411929de02ae395f3a58d3265029f2efe3ef5b635766b48040846f"} Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.396173 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f6824f3f1411929de02ae395f3a58d3265029f2efe3ef5b635766b48040846f" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.396208 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.498387 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7"] Oct 02 07:15:40 crc kubenswrapper[4786]: E1002 07:15:40.498806 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5e53e7-0f7d-4d7a-b410-364982cf5311" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.498826 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5e53e7-0f7d-4d7a-b410-364982cf5311" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.499025 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5e53e7-0f7d-4d7a-b410-364982cf5311" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.499623 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.501757 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.501781 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.501854 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.501770 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.502593 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.505301 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7"] Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.517932 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.518052 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.518131 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.518166 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.518201 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wtqs\" (UniqueName: \"kubernetes.io/projected/f597e524-6913-4445-ac68-00fb20d044b8-kube-api-access-7wtqs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.619524 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.619586 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.619624 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtqs\" (UniqueName: \"kubernetes.io/projected/f597e524-6913-4445-ac68-00fb20d044b8-kube-api-access-7wtqs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.619753 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.619859 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.623738 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.623738 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.624594 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.624855 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.632379 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wtqs\" (UniqueName: \"kubernetes.io/projected/f597e524-6913-4445-ac68-00fb20d044b8-kube-api-access-7wtqs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:40 crc kubenswrapper[4786]: I1002 07:15:40.811663 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:15:41 crc kubenswrapper[4786]: I1002 07:15:41.317855 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7"] Oct 02 07:15:41 crc kubenswrapper[4786]: I1002 07:15:41.404376 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" event={"ID":"f597e524-6913-4445-ac68-00fb20d044b8","Type":"ContainerStarted","Data":"30c843295ba8693a0ed39fd9bf3f6a35ffa75b7ca7cebef590fc65494333a2e0"} Oct 02 07:15:42 crc kubenswrapper[4786]: I1002 07:15:42.411825 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" event={"ID":"f597e524-6913-4445-ac68-00fb20d044b8","Type":"ContainerStarted","Data":"5220b29fadf4546ab0ed103d02f7b35e41b36cb62f95809f632a96ceb69bac35"} Oct 02 07:15:42 crc kubenswrapper[4786]: I1002 07:15:42.428189 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" podStartSLOduration=1.946665256 podStartE2EDuration="2.428175526s" podCreationTimestamp="2025-10-02 07:15:40 +0000 UTC" firstStartedPulling="2025-10-02 07:15:41.322224452 +0000 UTC m=+1751.443407583" lastFinishedPulling="2025-10-02 07:15:41.803734721 +0000 UTC m=+1751.924917853" observedRunningTime="2025-10-02 07:15:42.423966971 +0000 UTC m=+1752.545150123" watchObservedRunningTime="2025-10-02 07:15:42.428175526 +0000 UTC m=+1752.549358657" Oct 02 07:15:47 crc kubenswrapper[4786]: I1002 07:15:47.179260 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:15:47 crc kubenswrapper[4786]: E1002 07:15:47.179823 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:16:01 crc kubenswrapper[4786]: I1002 07:16:01.179301 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:16:01 crc kubenswrapper[4786]: E1002 07:16:01.180121 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:16:16 crc kubenswrapper[4786]: I1002 07:16:16.179404 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:16:16 crc kubenswrapper[4786]: E1002 07:16:16.180131 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:16:29 crc kubenswrapper[4786]: I1002 07:16:29.179250 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:16:29 crc kubenswrapper[4786]: E1002 07:16:29.179818 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:16:42 crc kubenswrapper[4786]: I1002 07:16:42.178991 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:16:42 crc kubenswrapper[4786]: E1002 07:16:42.179564 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:16:57 crc kubenswrapper[4786]: I1002 07:16:57.179317 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:16:57 crc kubenswrapper[4786]: E1002 07:16:57.180017 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:17:10 crc kubenswrapper[4786]: I1002 07:17:10.183540 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:17:10 crc kubenswrapper[4786]: I1002 07:17:10.995142 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" 
event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"265eb364681853ffc77ef3320cccb340344ef4bcb3e650d0e6862bf07933d860"} Oct 02 07:18:35 crc kubenswrapper[4786]: I1002 07:18:35.527186 4786 generic.go:334] "Generic (PLEG): container finished" podID="f597e524-6913-4445-ac68-00fb20d044b8" containerID="5220b29fadf4546ab0ed103d02f7b35e41b36cb62f95809f632a96ceb69bac35" exitCode=0 Oct 02 07:18:35 crc kubenswrapper[4786]: I1002 07:18:35.527277 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" event={"ID":"f597e524-6913-4445-ac68-00fb20d044b8","Type":"ContainerDied","Data":"5220b29fadf4546ab0ed103d02f7b35e41b36cb62f95809f632a96ceb69bac35"} Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.842321 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.901009 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-inventory\") pod \"f597e524-6913-4445-ac68-00fb20d044b8\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.901461 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-secret-0\") pod \"f597e524-6913-4445-ac68-00fb20d044b8\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.901502 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-combined-ca-bundle\") pod \"f597e524-6913-4445-ac68-00fb20d044b8\" (UID: 
\"f597e524-6913-4445-ac68-00fb20d044b8\") " Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.901600 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wtqs\" (UniqueName: \"kubernetes.io/projected/f597e524-6913-4445-ac68-00fb20d044b8-kube-api-access-7wtqs\") pod \"f597e524-6913-4445-ac68-00fb20d044b8\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.901663 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-ssh-key\") pod \"f597e524-6913-4445-ac68-00fb20d044b8\" (UID: \"f597e524-6913-4445-ac68-00fb20d044b8\") " Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.907087 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f597e524-6913-4445-ac68-00fb20d044b8-kube-api-access-7wtqs" (OuterVolumeSpecName: "kube-api-access-7wtqs") pod "f597e524-6913-4445-ac68-00fb20d044b8" (UID: "f597e524-6913-4445-ac68-00fb20d044b8"). InnerVolumeSpecName "kube-api-access-7wtqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.907231 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f597e524-6913-4445-ac68-00fb20d044b8" (UID: "f597e524-6913-4445-ac68-00fb20d044b8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.924060 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "f597e524-6913-4445-ac68-00fb20d044b8" (UID: "f597e524-6913-4445-ac68-00fb20d044b8"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.924343 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f597e524-6913-4445-ac68-00fb20d044b8" (UID: "f597e524-6913-4445-ac68-00fb20d044b8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:18:36 crc kubenswrapper[4786]: I1002 07:18:36.925292 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-inventory" (OuterVolumeSpecName: "inventory") pod "f597e524-6913-4445-ac68-00fb20d044b8" (UID: "f597e524-6913-4445-ac68-00fb20d044b8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.003817 4786 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.003847 4786 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.003872 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wtqs\" (UniqueName: \"kubernetes.io/projected/f597e524-6913-4445-ac68-00fb20d044b8-kube-api-access-7wtqs\") on node \"crc\" DevicePath \"\"" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.003880 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.003889 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f597e524-6913-4445-ac68-00fb20d044b8-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.542003 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" event={"ID":"f597e524-6913-4445-ac68-00fb20d044b8","Type":"ContainerDied","Data":"30c843295ba8693a0ed39fd9bf3f6a35ffa75b7ca7cebef590fc65494333a2e0"} Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.542046 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30c843295ba8693a0ed39fd9bf3f6a35ffa75b7ca7cebef590fc65494333a2e0" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 
07:18:37.542095 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.608805 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm"] Oct 02 07:18:37 crc kubenswrapper[4786]: E1002 07:18:37.609481 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f597e524-6913-4445-ac68-00fb20d044b8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.609503 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="f597e524-6913-4445-ac68-00fb20d044b8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.609734 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="f597e524-6913-4445-ac68-00fb20d044b8" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.610282 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.616107 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.616145 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.618293 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.618547 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.619158 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.619283 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.619545 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.619901 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm"] Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712140 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712202 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7pxw\" (UniqueName: \"kubernetes.io/projected/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-kube-api-access-r7pxw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712314 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712381 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712441 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712510 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712533 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712581 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.712639 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813299 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813335 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7pxw\" (UniqueName: \"kubernetes.io/projected/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-kube-api-access-r7pxw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813375 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813410 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813439 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" 
(UniqueName: \"kubernetes.io/configmap/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813498 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813527 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.813557 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.814334 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 
02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.816939 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.817297 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.817305 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.817802 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.818198 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.818722 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.818847 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.827789 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7pxw\" (UniqueName: \"kubernetes.io/projected/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-kube-api-access-r7pxw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-qnbsm\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:37 crc kubenswrapper[4786]: I1002 07:18:37.927233 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:18:38 crc kubenswrapper[4786]: I1002 07:18:38.353046 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm"] Oct 02 07:18:38 crc kubenswrapper[4786]: I1002 07:18:38.357637 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 07:18:38 crc kubenswrapper[4786]: I1002 07:18:38.549357 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" event={"ID":"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e","Type":"ContainerStarted","Data":"1df5ce458b73ff2fa523356bfe487a03797a5dd6456345c03259fbd1a371ef85"} Oct 02 07:18:39 crc kubenswrapper[4786]: I1002 07:18:39.557271 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" event={"ID":"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e","Type":"ContainerStarted","Data":"112776044faeba60199d663a10a07c61d20c470023530ec4ab9aa6bd5f45023a"} Oct 02 07:18:39 crc kubenswrapper[4786]: I1002 07:18:39.574069 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" podStartSLOduration=2.117792665 podStartE2EDuration="2.574052281s" podCreationTimestamp="2025-10-02 07:18:37 +0000 UTC" firstStartedPulling="2025-10-02 07:18:38.357413695 +0000 UTC m=+1928.478596827" lastFinishedPulling="2025-10-02 07:18:38.813673312 +0000 UTC m=+1928.934856443" observedRunningTime="2025-10-02 07:18:39.569499436 +0000 UTC m=+1929.690682577" watchObservedRunningTime="2025-10-02 07:18:39.574052281 +0000 UTC m=+1929.695235412" Oct 02 07:19:27 crc kubenswrapper[4786]: I1002 07:19:27.497744 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:19:27 crc kubenswrapper[4786]: I1002 07:19:27.498173 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:19:57 crc kubenswrapper[4786]: I1002 07:19:57.497339 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:19:57 crc kubenswrapper[4786]: I1002 07:19:57.498766 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:20:27 crc kubenswrapper[4786]: I1002 07:20:27.497619 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:20:27 crc kubenswrapper[4786]: I1002 07:20:27.498076 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 
07:20:27 crc kubenswrapper[4786]: I1002 07:20:27.498119 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 07:20:27 crc kubenswrapper[4786]: I1002 07:20:27.498716 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"265eb364681853ffc77ef3320cccb340344ef4bcb3e650d0e6862bf07933d860"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 07:20:27 crc kubenswrapper[4786]: I1002 07:20:27.498763 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://265eb364681853ffc77ef3320cccb340344ef4bcb3e650d0e6862bf07933d860" gracePeriod=600 Oct 02 07:20:28 crc kubenswrapper[4786]: I1002 07:20:28.245526 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="265eb364681853ffc77ef3320cccb340344ef4bcb3e650d0e6862bf07933d860" exitCode=0 Oct 02 07:20:28 crc kubenswrapper[4786]: I1002 07:20:28.245596 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"265eb364681853ffc77ef3320cccb340344ef4bcb3e650d0e6862bf07933d860"} Oct 02 07:20:28 crc kubenswrapper[4786]: I1002 07:20:28.245768 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5"} Oct 02 07:20:28 crc kubenswrapper[4786]: 
I1002 07:20:28.245795 4786 scope.go:117] "RemoveContainer" containerID="0921ca0a793e40b1dafc111b301c6fdb41f303d4fc81ea756fc5324e71db2921" Oct 02 07:20:47 crc kubenswrapper[4786]: I1002 07:20:47.369816 4786 generic.go:334] "Generic (PLEG): container finished" podID="0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" containerID="112776044faeba60199d663a10a07c61d20c470023530ec4ab9aa6bd5f45023a" exitCode=0 Oct 02 07:20:47 crc kubenswrapper[4786]: I1002 07:20:47.369901 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" event={"ID":"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e","Type":"ContainerDied","Data":"112776044faeba60199d663a10a07c61d20c470023530ec4ab9aa6bd5f45023a"} Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.674286 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748239 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-0\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748309 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-inventory\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748363 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-1\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: 
\"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748399 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-combined-ca-bundle\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748442 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-ssh-key\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748466 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7pxw\" (UniqueName: \"kubernetes.io/projected/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-kube-api-access-r7pxw\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748553 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-1\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748580 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-extra-config-0\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.748609 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-0\") pod \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\" (UID: \"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e\") " Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.756034 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-kube-api-access-r7pxw" (OuterVolumeSpecName: "kube-api-access-r7pxw") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "kube-api-access-r7pxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.763967 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.771716 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.771875 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-inventory" (OuterVolumeSpecName: "inventory") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.774064 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.774495 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.775275 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.775358 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.779801 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" (UID: "0378af36-7eb4-4ef7-bf6a-dc2bd678c31e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850638 4786 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850670 4786 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850680 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850708 4786 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850720 4786 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850727 4786 reconciler_common.go:293] "Volume detached for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850736 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7pxw\" (UniqueName: \"kubernetes.io/projected/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-kube-api-access-r7pxw\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850744 4786 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:48 crc kubenswrapper[4786]: I1002 07:20:48.850753 4786 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0378af36-7eb4-4ef7-bf6a-dc2bd678c31e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.384997 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" event={"ID":"0378af36-7eb4-4ef7-bf6a-dc2bd678c31e","Type":"ContainerDied","Data":"1df5ce458b73ff2fa523356bfe487a03797a5dd6456345c03259fbd1a371ef85"} Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.385035 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1df5ce458b73ff2fa523356bfe487a03797a5dd6456345c03259fbd1a371ef85" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.385037 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-qnbsm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.454047 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm"] Oct 02 07:20:49 crc kubenswrapper[4786]: E1002 07:20:49.454398 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.454418 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.454606 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="0378af36-7eb4-4ef7-bf6a-dc2bd678c31e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.455152 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.456554 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.456808 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.457547 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.457642 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-hxdbp" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.457803 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.468382 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm"] Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.559045 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.559123 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: 
\"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.559215 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szj79\" (UniqueName: \"kubernetes.io/projected/b470bf8f-a980-45f7-b488-eda5005abf20-kube-api-access-szj79\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.559491 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.559535 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.559568 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.559622 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.660789 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.660825 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.660845 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.660868 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.660911 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.660957 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.660981 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szj79\" (UniqueName: \"kubernetes.io/projected/b470bf8f-a980-45f7-b488-eda5005abf20-kube-api-access-szj79\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.664725 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: 
\"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.664876 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.665589 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.665809 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.666518 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.666766 4786 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.673788 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szj79\" (UniqueName: \"kubernetes.io/projected/b470bf8f-a980-45f7-b488-eda5005abf20-kube-api-access-szj79\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:49 crc kubenswrapper[4786]: I1002 07:20:49.773170 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:20:50 crc kubenswrapper[4786]: I1002 07:20:50.196031 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm"] Oct 02 07:20:50 crc kubenswrapper[4786]: I1002 07:20:50.391797 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" event={"ID":"b470bf8f-a980-45f7-b488-eda5005abf20","Type":"ContainerStarted","Data":"2cfd96bef225cca3a5a4796f1c4c3814bac72742a56b837de613b2e50653bf49"} Oct 02 07:20:51 crc kubenswrapper[4786]: I1002 07:20:51.399545 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" event={"ID":"b470bf8f-a980-45f7-b488-eda5005abf20","Type":"ContainerStarted","Data":"fd281481b8ecb7cbf015e8d716da4bfc3e59aed9037018f387627d8c164efce9"} Oct 02 07:20:51 crc kubenswrapper[4786]: I1002 07:20:51.414385 4786 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" podStartSLOduration=1.890955961 podStartE2EDuration="2.414369056s" podCreationTimestamp="2025-10-02 07:20:49 +0000 UTC" firstStartedPulling="2025-10-02 07:20:50.199045411 +0000 UTC m=+2060.320228542" lastFinishedPulling="2025-10-02 07:20:50.722458506 +0000 UTC m=+2060.843641637" observedRunningTime="2025-10-02 07:20:51.412857828 +0000 UTC m=+2061.534040969" watchObservedRunningTime="2025-10-02 07:20:51.414369056 +0000 UTC m=+2061.535552188" Oct 02 07:22:23 crc kubenswrapper[4786]: E1002 07:22:23.498074 4786 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb470bf8f_a980_45f7_b488_eda5005abf20.slice/crio-fd281481b8ecb7cbf015e8d716da4bfc3e59aed9037018f387627d8c164efce9.scope\": RecentStats: unable to find data in memory cache]" Oct 02 07:22:23 crc kubenswrapper[4786]: I1002 07:22:23.927841 4786 generic.go:334] "Generic (PLEG): container finished" podID="b470bf8f-a980-45f7-b488-eda5005abf20" containerID="fd281481b8ecb7cbf015e8d716da4bfc3e59aed9037018f387627d8c164efce9" exitCode=0 Oct 02 07:22:23 crc kubenswrapper[4786]: I1002 07:22:23.927932 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" event={"ID":"b470bf8f-a980-45f7-b488-eda5005abf20","Type":"ContainerDied","Data":"fd281481b8ecb7cbf015e8d716da4bfc3e59aed9037018f387627d8c164efce9"} Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.232053 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.261591 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-0\") pod \"b470bf8f-a980-45f7-b488-eda5005abf20\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.261637 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-inventory\") pod \"b470bf8f-a980-45f7-b488-eda5005abf20\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.261655 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ssh-key\") pod \"b470bf8f-a980-45f7-b488-eda5005abf20\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.261716 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-telemetry-combined-ca-bundle\") pod \"b470bf8f-a980-45f7-b488-eda5005abf20\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.261739 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-1\") pod \"b470bf8f-a980-45f7-b488-eda5005abf20\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.261788 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-2\") pod \"b470bf8f-a980-45f7-b488-eda5005abf20\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.261859 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szj79\" (UniqueName: \"kubernetes.io/projected/b470bf8f-a980-45f7-b488-eda5005abf20-kube-api-access-szj79\") pod \"b470bf8f-a980-45f7-b488-eda5005abf20\" (UID: \"b470bf8f-a980-45f7-b488-eda5005abf20\") " Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.269542 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b470bf8f-a980-45f7-b488-eda5005abf20" (UID: "b470bf8f-a980-45f7-b488-eda5005abf20"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.269762 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b470bf8f-a980-45f7-b488-eda5005abf20-kube-api-access-szj79" (OuterVolumeSpecName: "kube-api-access-szj79") pod "b470bf8f-a980-45f7-b488-eda5005abf20" (UID: "b470bf8f-a980-45f7-b488-eda5005abf20"). InnerVolumeSpecName "kube-api-access-szj79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.283071 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b470bf8f-a980-45f7-b488-eda5005abf20" (UID: "b470bf8f-a980-45f7-b488-eda5005abf20"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.283090 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b470bf8f-a980-45f7-b488-eda5005abf20" (UID: "b470bf8f-a980-45f7-b488-eda5005abf20"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.283946 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-inventory" (OuterVolumeSpecName: "inventory") pod "b470bf8f-a980-45f7-b488-eda5005abf20" (UID: "b470bf8f-a980-45f7-b488-eda5005abf20"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.283967 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b470bf8f-a980-45f7-b488-eda5005abf20" (UID: "b470bf8f-a980-45f7-b488-eda5005abf20"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.284285 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b470bf8f-a980-45f7-b488-eda5005abf20" (UID: "b470bf8f-a980-45f7-b488-eda5005abf20"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.363592 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szj79\" (UniqueName: \"kubernetes.io/projected/b470bf8f-a980-45f7-b488-eda5005abf20-kube-api-access-szj79\") on node \"crc\" DevicePath \"\"" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.363621 4786 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.363632 4786 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.363643 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.363652 4786 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.363661 4786 
reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.363669 4786 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b470bf8f-a980-45f7-b488-eda5005abf20-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.941497 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" event={"ID":"b470bf8f-a980-45f7-b488-eda5005abf20","Type":"ContainerDied","Data":"2cfd96bef225cca3a5a4796f1c4c3814bac72742a56b837de613b2e50653bf49"} Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.941725 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cfd96bef225cca3a5a4796f1c4c3814bac72742a56b837de613b2e50653bf49" Oct 02 07:22:25 crc kubenswrapper[4786]: I1002 07:22:25.941540 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm" Oct 02 07:22:27 crc kubenswrapper[4786]: I1002 07:22:27.497716 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:22:27 crc kubenswrapper[4786]: I1002 07:22:27.497773 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:22:57 crc kubenswrapper[4786]: I1002 07:22:57.497194 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:22:57 crc kubenswrapper[4786]: I1002 07:22:57.497557 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.338537 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 07:23:10 crc kubenswrapper[4786]: E1002 07:23:10.339286 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b470bf8f-a980-45f7-b488-eda5005abf20" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 07:23:10 crc 
kubenswrapper[4786]: I1002 07:23:10.339301 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="b470bf8f-a980-45f7-b488-eda5005abf20" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.339518 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="b470bf8f-a980-45f7-b488-eda5005abf20" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.340073 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.341421 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.341714 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6n4v6" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.341769 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.342192 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.344303 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.411927 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-config-data\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.411966 4786 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.412288 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514292 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514340 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514471 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514516 4786 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514739 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-config-data\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514778 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514825 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514848 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.514999 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hpkjr\" (UniqueName: \"kubernetes.io/projected/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-kube-api-access-hpkjr\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.515744 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-config-data\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.515995 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.518777 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616007 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616061 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616146 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpkjr\" (UniqueName: \"kubernetes.io/projected/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-kube-api-access-hpkjr\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616187 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616215 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616246 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616342 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616599 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.616666 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.620130 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.620965 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.628651 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpkjr\" (UniqueName: \"kubernetes.io/projected/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-kube-api-access-hpkjr\") pod \"tempest-tests-tempest\" (UID: 
\"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.634831 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " pod="openstack/tempest-tests-tempest" Oct 02 07:23:10 crc kubenswrapper[4786]: I1002 07:23:10.654913 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 07:23:11 crc kubenswrapper[4786]: I1002 07:23:11.008272 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 02 07:23:11 crc kubenswrapper[4786]: I1002 07:23:11.218575 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9","Type":"ContainerStarted","Data":"77d9babc089bb332e17913cf1d4f567ecaaca3f8e20714af339acb906e0c0065"} Oct 02 07:23:27 crc kubenswrapper[4786]: I1002 07:23:27.497397 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:23:27 crc kubenswrapper[4786]: I1002 07:23:27.497853 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:23:27 crc kubenswrapper[4786]: I1002 07:23:27.497889 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 07:23:27 crc kubenswrapper[4786]: I1002 07:23:27.498274 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 07:23:27 crc kubenswrapper[4786]: I1002 07:23:27.498319 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" gracePeriod=600 Oct 02 07:23:27 crc kubenswrapper[4786]: E1002 07:23:27.613219 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:23:28 crc kubenswrapper[4786]: I1002 07:23:28.332714 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" exitCode=0 Oct 02 07:23:28 crc kubenswrapper[4786]: I1002 07:23:28.332715 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5"} Oct 02 07:23:28 crc 
kubenswrapper[4786]: I1002 07:23:28.332940 4786 scope.go:117] "RemoveContainer" containerID="265eb364681853ffc77ef3320cccb340344ef4bcb3e650d0e6862bf07933d860" Oct 02 07:23:28 crc kubenswrapper[4786]: I1002 07:23:28.333361 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:23:28 crc kubenswrapper[4786]: E1002 07:23:28.333636 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.027049 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9rtqd"] Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.030427 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.049860 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rtqd"] Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.070059 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-catalog-content\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.070121 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pn9j\" (UniqueName: \"kubernetes.io/projected/fbf0a18c-9a73-4638-b581-3f435abe1fbe-kube-api-access-8pn9j\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.070195 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-utilities\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.171827 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-catalog-content\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.172028 4786 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8pn9j\" (UniqueName: \"kubernetes.io/projected/fbf0a18c-9a73-4638-b581-3f435abe1fbe-kube-api-access-8pn9j\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.172230 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-catalog-content\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.172235 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-utilities\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.172437 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-utilities\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.189988 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pn9j\" (UniqueName: \"kubernetes.io/projected/fbf0a18c-9a73-4638-b581-3f435abe1fbe-kube-api-access-8pn9j\") pod \"redhat-marketplace-9rtqd\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.349927 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:39 crc kubenswrapper[4786]: I1002 07:23:39.715373 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rtqd"] Oct 02 07:23:40 crc kubenswrapper[4786]: I1002 07:23:40.427967 4786 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerID="0191b0e856ca0d3fc29949c34bc3def4c3ab6e2091cd5fb1919f5b35ec607b72" exitCode=0 Oct 02 07:23:40 crc kubenswrapper[4786]: I1002 07:23:40.428053 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rtqd" event={"ID":"fbf0a18c-9a73-4638-b581-3f435abe1fbe","Type":"ContainerDied","Data":"0191b0e856ca0d3fc29949c34bc3def4c3ab6e2091cd5fb1919f5b35ec607b72"} Oct 02 07:23:40 crc kubenswrapper[4786]: I1002 07:23:40.428233 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rtqd" event={"ID":"fbf0a18c-9a73-4638-b581-3f435abe1fbe","Type":"ContainerStarted","Data":"ebb1cbdafed1e28e22deeec7b354420bd5e73f91605656bb717d27c26c1cfe83"} Oct 02 07:23:42 crc kubenswrapper[4786]: I1002 07:23:42.179533 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:23:42 crc kubenswrapper[4786]: E1002 07:23:42.180076 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:23:52 crc kubenswrapper[4786]: I1002 07:23:52.044021 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 
07:23:52 crc kubenswrapper[4786]: E1002 07:23:52.071925 4786 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:a0eac564d779a7eaac46c9816bff261a" Oct 02 07:23:52 crc kubenswrapper[4786]: E1002 07:23:52.071963 4786 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:a0eac564d779a7eaac46c9816bff261a" Oct 02 07:23:52 crc kubenswrapper[4786]: E1002 07:23:52.072105 4786 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:a0eac564d779a7eaac46c9816bff261a,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPat
hExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hpkjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 07:23:52 crc kubenswrapper[4786]: E1002 07:23:52.073493 
4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" Oct 02 07:23:52 crc kubenswrapper[4786]: E1002 07:23:52.523026 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:a0eac564d779a7eaac46c9816bff261a\\\"\"" pod="openstack/tempest-tests-tempest" podUID="2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" Oct 02 07:23:53 crc kubenswrapper[4786]: I1002 07:23:53.529860 4786 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerID="c7b2e5de1f288ca3c8fa13331264af7e60def1b522ba1139c6bbb8c281c3f749" exitCode=0 Oct 02 07:23:53 crc kubenswrapper[4786]: I1002 07:23:53.529905 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rtqd" event={"ID":"fbf0a18c-9a73-4638-b581-3f435abe1fbe","Type":"ContainerDied","Data":"c7b2e5de1f288ca3c8fa13331264af7e60def1b522ba1139c6bbb8c281c3f749"} Oct 02 07:23:54 crc kubenswrapper[4786]: I1002 07:23:54.179413 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:23:54 crc kubenswrapper[4786]: E1002 07:23:54.179826 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 
02 07:23:54 crc kubenswrapper[4786]: I1002 07:23:54.538007 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rtqd" event={"ID":"fbf0a18c-9a73-4638-b581-3f435abe1fbe","Type":"ContainerStarted","Data":"f35a6afeb90e307426b3730011d53b65bf011bc038dffa73f5c9c9bdb9948db2"} Oct 02 07:23:54 crc kubenswrapper[4786]: I1002 07:23:54.552128 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9rtqd" podStartSLOduration=13.53169655 podStartE2EDuration="15.552115303s" podCreationTimestamp="2025-10-02 07:23:39 +0000 UTC" firstStartedPulling="2025-10-02 07:23:52.043808968 +0000 UTC m=+2242.164992099" lastFinishedPulling="2025-10-02 07:23:54.064227721 +0000 UTC m=+2244.185410852" observedRunningTime="2025-10-02 07:23:54.549018624 +0000 UTC m=+2244.670201786" watchObservedRunningTime="2025-10-02 07:23:54.552115303 +0000 UTC m=+2244.673298434" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.209368 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2zb7n"] Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.213430 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.218291 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zb7n"] Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.364883 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-utilities\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.365460 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmdsc\" (UniqueName: \"kubernetes.io/projected/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-kube-api-access-gmdsc\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.365561 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-catalog-content\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.408671 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-97g9j"] Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.410470 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.424608 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-97g9j"] Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.467591 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-utilities\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.467929 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmdsc\" (UniqueName: \"kubernetes.io/projected/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-kube-api-access-gmdsc\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.468042 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-utilities\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.468259 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-catalog-content\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.468853 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-catalog-content\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.486055 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmdsc\" (UniqueName: \"kubernetes.io/projected/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-kube-api-access-gmdsc\") pod \"redhat-operators-2zb7n\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.534089 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.570065 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-catalog-content\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.570120 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-utilities\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.570197 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgb6b\" (UniqueName: \"kubernetes.io/projected/4fc28512-b1f9-4298-b044-528f1ed5da03-kube-api-access-mgb6b\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " 
pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.671355 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-catalog-content\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.671393 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-utilities\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.671439 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgb6b\" (UniqueName: \"kubernetes.io/projected/4fc28512-b1f9-4298-b044-528f1ed5da03-kube-api-access-mgb6b\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.671857 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-catalog-content\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.672021 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-utilities\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " 
pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.689666 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgb6b\" (UniqueName: \"kubernetes.io/projected/4fc28512-b1f9-4298-b044-528f1ed5da03-kube-api-access-mgb6b\") pod \"community-operators-97g9j\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.726054 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:23:58 crc kubenswrapper[4786]: I1002 07:23:58.968853 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2zb7n"] Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.177500 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-97g9j"] Oct 02 07:23:59 crc kubenswrapper[4786]: W1002 07:23:59.185862 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fc28512_b1f9_4298_b044_528f1ed5da03.slice/crio-cfd6f6ac2106830c8712404c3d7ffe36ca290925b4f687791a3e6955f1fcd3d2 WatchSource:0}: Error finding container cfd6f6ac2106830c8712404c3d7ffe36ca290925b4f687791a3e6955f1fcd3d2: Status 404 returned error can't find the container with id cfd6f6ac2106830c8712404c3d7ffe36ca290925b4f687791a3e6955f1fcd3d2 Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.350742 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.350800 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.381121 4786 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.573785 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerID="1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb" exitCode=0 Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.573829 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97g9j" event={"ID":"4fc28512-b1f9-4298-b044-528f1ed5da03","Type":"ContainerDied","Data":"1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb"} Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.573872 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97g9j" event={"ID":"4fc28512-b1f9-4298-b044-528f1ed5da03","Type":"ContainerStarted","Data":"cfd6f6ac2106830c8712404c3d7ffe36ca290925b4f687791a3e6955f1fcd3d2"} Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.575042 4786 generic.go:334] "Generic (PLEG): container finished" podID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerID="7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2" exitCode=0 Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.575124 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb7n" event={"ID":"4cb7ea57-f8c4-4b2e-977a-6e252dd76141","Type":"ContainerDied","Data":"7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2"} Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.575168 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb7n" event={"ID":"4cb7ea57-f8c4-4b2e-977a-6e252dd76141","Type":"ContainerStarted","Data":"4cef19847b5fe7079e635a5ea5886b04432401b21b5f4b01b4148419099db8d5"} Oct 02 07:23:59 crc kubenswrapper[4786]: I1002 07:23:59.610806 4786 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.583408 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerID="01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158" exitCode=0 Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.583449 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97g9j" event={"ID":"4fc28512-b1f9-4298-b044-528f1ed5da03","Type":"ContainerDied","Data":"01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158"} Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.586960 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb7n" event={"ID":"4cb7ea57-f8c4-4b2e-977a-6e252dd76141","Type":"ContainerStarted","Data":"e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8"} Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.617304 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lqpjl"] Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.618908 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.642042 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqpjl"] Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.706028 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lpnx\" (UniqueName: \"kubernetes.io/projected/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-kube-api-access-5lpnx\") pod \"certified-operators-lqpjl\" (UID: \"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.706152 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-utilities\") pod \"certified-operators-lqpjl\" (UID: \"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.706192 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-catalog-content\") pod \"certified-operators-lqpjl\" (UID: \"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.804739 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rtqd"] Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.807370 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-catalog-content\") pod \"certified-operators-lqpjl\" (UID: 
\"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.807495 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lpnx\" (UniqueName: \"kubernetes.io/projected/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-kube-api-access-5lpnx\") pod \"certified-operators-lqpjl\" (UID: \"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.807671 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-utilities\") pod \"certified-operators-lqpjl\" (UID: \"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.807841 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-catalog-content\") pod \"certified-operators-lqpjl\" (UID: \"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.808125 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-utilities\") pod \"certified-operators-lqpjl\" (UID: \"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.827455 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lpnx\" (UniqueName: \"kubernetes.io/projected/32e6a72b-b5cb-4538-8e54-0a48ad8b88a0-kube-api-access-5lpnx\") pod \"certified-operators-lqpjl\" (UID: 
\"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0\") " pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:00 crc kubenswrapper[4786]: I1002 07:24:00.934230 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:01 crc kubenswrapper[4786]: I1002 07:24:01.594160 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97g9j" event={"ID":"4fc28512-b1f9-4298-b044-528f1ed5da03","Type":"ContainerStarted","Data":"d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9"} Oct 02 07:24:01 crc kubenswrapper[4786]: I1002 07:24:01.594486 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9rtqd" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerName="registry-server" containerID="cri-o://f35a6afeb90e307426b3730011d53b65bf011bc038dffa73f5c9c9bdb9948db2" gracePeriod=2 Oct 02 07:24:01 crc kubenswrapper[4786]: I1002 07:24:01.610479 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-97g9j" podStartSLOduration=1.800889535 podStartE2EDuration="3.610466678s" podCreationTimestamp="2025-10-02 07:23:58 +0000 UTC" firstStartedPulling="2025-10-02 07:23:59.575561009 +0000 UTC m=+2249.696744140" lastFinishedPulling="2025-10-02 07:24:01.385138152 +0000 UTC m=+2251.506321283" observedRunningTime="2025-10-02 07:24:01.605712534 +0000 UTC m=+2251.726895675" watchObservedRunningTime="2025-10-02 07:24:01.610466678 +0000 UTC m=+2251.731649809" Oct 02 07:24:01 crc kubenswrapper[4786]: I1002 07:24:01.731756 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqpjl"] Oct 02 07:24:01 crc kubenswrapper[4786]: W1002 07:24:01.732793 4786 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32e6a72b_b5cb_4538_8e54_0a48ad8b88a0.slice/crio-16b23838618949a10fc39d75a25b6f2487cbd2acc6079d1cfd32a9a840722f14 WatchSource:0}: Error finding container 16b23838618949a10fc39d75a25b6f2487cbd2acc6079d1cfd32a9a840722f14: Status 404 returned error can't find the container with id 16b23838618949a10fc39d75a25b6f2487cbd2acc6079d1cfd32a9a840722f14 Oct 02 07:24:02 crc kubenswrapper[4786]: I1002 07:24:02.602414 4786 generic.go:334] "Generic (PLEG): container finished" podID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerID="e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8" exitCode=0 Oct 02 07:24:02 crc kubenswrapper[4786]: I1002 07:24:02.602492 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb7n" event={"ID":"4cb7ea57-f8c4-4b2e-977a-6e252dd76141","Type":"ContainerDied","Data":"e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8"} Oct 02 07:24:02 crc kubenswrapper[4786]: I1002 07:24:02.604805 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqpjl" event={"ID":"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0","Type":"ContainerStarted","Data":"e8d0014cc31e331ddd3bdfde325970c50bac62681810f46a36f487302729fd44"} Oct 02 07:24:02 crc kubenswrapper[4786]: I1002 07:24:02.604841 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqpjl" event={"ID":"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0","Type":"ContainerStarted","Data":"16b23838618949a10fc39d75a25b6f2487cbd2acc6079d1cfd32a9a840722f14"} Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.614126 4786 generic.go:334] "Generic (PLEG): container finished" podID="32e6a72b-b5cb-4538-8e54-0a48ad8b88a0" containerID="e8d0014cc31e331ddd3bdfde325970c50bac62681810f46a36f487302729fd44" exitCode=0 Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.614206 4786 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-lqpjl" event={"ID":"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0","Type":"ContainerDied","Data":"e8d0014cc31e331ddd3bdfde325970c50bac62681810f46a36f487302729fd44"} Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.618198 4786 generic.go:334] "Generic (PLEG): container finished" podID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerID="f35a6afeb90e307426b3730011d53b65bf011bc038dffa73f5c9c9bdb9948db2" exitCode=0 Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.618255 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rtqd" event={"ID":"fbf0a18c-9a73-4638-b581-3f435abe1fbe","Type":"ContainerDied","Data":"f35a6afeb90e307426b3730011d53b65bf011bc038dffa73f5c9c9bdb9948db2"} Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.623830 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb7n" event={"ID":"4cb7ea57-f8c4-4b2e-977a-6e252dd76141","Type":"ContainerStarted","Data":"0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae"} Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.654416 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2zb7n" podStartSLOduration=2.151352486 podStartE2EDuration="5.654403775s" podCreationTimestamp="2025-10-02 07:23:58 +0000 UTC" firstStartedPulling="2025-10-02 07:23:59.577711164 +0000 UTC m=+2249.698894294" lastFinishedPulling="2025-10-02 07:24:03.080762452 +0000 UTC m=+2253.201945583" observedRunningTime="2025-10-02 07:24:03.649144718 +0000 UTC m=+2253.770327849" watchObservedRunningTime="2025-10-02 07:24:03.654403775 +0000 UTC m=+2253.775586905" Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.779413 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.856261 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-utilities\") pod \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.856402 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-catalog-content\") pod \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.856458 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pn9j\" (UniqueName: \"kubernetes.io/projected/fbf0a18c-9a73-4638-b581-3f435abe1fbe-kube-api-access-8pn9j\") pod \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\" (UID: \"fbf0a18c-9a73-4638-b581-3f435abe1fbe\") " Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.856840 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-utilities" (OuterVolumeSpecName: "utilities") pod "fbf0a18c-9a73-4638-b581-3f435abe1fbe" (UID: "fbf0a18c-9a73-4638-b581-3f435abe1fbe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.856948 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.864876 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbf0a18c-9a73-4638-b581-3f435abe1fbe" (UID: "fbf0a18c-9a73-4638-b581-3f435abe1fbe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.865342 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf0a18c-9a73-4638-b581-3f435abe1fbe-kube-api-access-8pn9j" (OuterVolumeSpecName: "kube-api-access-8pn9j") pod "fbf0a18c-9a73-4638-b581-3f435abe1fbe" (UID: "fbf0a18c-9a73-4638-b581-3f435abe1fbe"). InnerVolumeSpecName "kube-api-access-8pn9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.958664 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf0a18c-9a73-4638-b581-3f435abe1fbe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:03 crc kubenswrapper[4786]: I1002 07:24:03.958707 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pn9j\" (UniqueName: \"kubernetes.io/projected/fbf0a18c-9a73-4638-b581-3f435abe1fbe-kube-api-access-8pn9j\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:04 crc kubenswrapper[4786]: I1002 07:24:04.634017 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rtqd" event={"ID":"fbf0a18c-9a73-4638-b581-3f435abe1fbe","Type":"ContainerDied","Data":"ebb1cbdafed1e28e22deeec7b354420bd5e73f91605656bb717d27c26c1cfe83"} Oct 02 07:24:04 crc kubenswrapper[4786]: I1002 07:24:04.634219 4786 scope.go:117] "RemoveContainer" containerID="f35a6afeb90e307426b3730011d53b65bf011bc038dffa73f5c9c9bdb9948db2" Oct 02 07:24:04 crc kubenswrapper[4786]: I1002 07:24:04.634072 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rtqd" Oct 02 07:24:04 crc kubenswrapper[4786]: I1002 07:24:04.651447 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rtqd"] Oct 02 07:24:04 crc kubenswrapper[4786]: I1002 07:24:04.658045 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rtqd"] Oct 02 07:24:04 crc kubenswrapper[4786]: I1002 07:24:04.660020 4786 scope.go:117] "RemoveContainer" containerID="c7b2e5de1f288ca3c8fa13331264af7e60def1b522ba1139c6bbb8c281c3f749" Oct 02 07:24:04 crc kubenswrapper[4786]: I1002 07:24:04.679344 4786 scope.go:117] "RemoveContainer" containerID="0191b0e856ca0d3fc29949c34bc3def4c3ab6e2091cd5fb1919f5b35ec607b72" Oct 02 07:24:06 crc kubenswrapper[4786]: I1002 07:24:06.191913 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" path="/var/lib/kubelet/pods/fbf0a18c-9a73-4638-b581-3f435abe1fbe/volumes" Oct 02 07:24:07 crc kubenswrapper[4786]: I1002 07:24:07.959058 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.534800 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.535020 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.637282 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.665739 4786 generic.go:334] "Generic (PLEG): container finished" podID="32e6a72b-b5cb-4538-8e54-0a48ad8b88a0" 
containerID="a672602f9ef2a0d72e9b6a370e57d501037717383bd80c89c67f251f73c6aca5" exitCode=0 Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.665815 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqpjl" event={"ID":"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0","Type":"ContainerDied","Data":"a672602f9ef2a0d72e9b6a370e57d501037717383bd80c89c67f251f73c6aca5"} Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.704397 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.727029 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.727074 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:24:08 crc kubenswrapper[4786]: I1002 07:24:08.759443 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:24:09 crc kubenswrapper[4786]: I1002 07:24:09.179422 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:24:09 crc kubenswrapper[4786]: E1002 07:24:09.179843 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:24:09 crc kubenswrapper[4786]: I1002 07:24:09.674870 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-lqpjl" event={"ID":"32e6a72b-b5cb-4538-8e54-0a48ad8b88a0","Type":"ContainerStarted","Data":"1b939baf7fc1e50f33f245d6d4199cf7355abe74ba5507e2987965c7f7fde66b"} Oct 02 07:24:09 crc kubenswrapper[4786]: I1002 07:24:09.676182 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9","Type":"ContainerStarted","Data":"4e85807243d2fc5c114c6268f0accbd515caa6fd0c9f6cc05ddbfc24ac979a7f"} Oct 02 07:24:09 crc kubenswrapper[4786]: I1002 07:24:09.693941 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lqpjl" podStartSLOduration=4.039062635 podStartE2EDuration="9.69392477s" podCreationTimestamp="2025-10-02 07:24:00 +0000 UTC" firstStartedPulling="2025-10-02 07:24:03.61536198 +0000 UTC m=+2253.736545111" lastFinishedPulling="2025-10-02 07:24:09.270224115 +0000 UTC m=+2259.391407246" observedRunningTime="2025-10-02 07:24:09.689157171 +0000 UTC m=+2259.810340312" watchObservedRunningTime="2025-10-02 07:24:09.69392477 +0000 UTC m=+2259.815107902" Oct 02 07:24:09 crc kubenswrapper[4786]: I1002 07:24:09.702012 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.756928735 podStartE2EDuration="1m0.701998133s" podCreationTimestamp="2025-10-02 07:23:09 +0000 UTC" firstStartedPulling="2025-10-02 07:23:11.012020279 +0000 UTC m=+2201.133203410" lastFinishedPulling="2025-10-02 07:24:07.957089676 +0000 UTC m=+2258.078272808" observedRunningTime="2025-10-02 07:24:09.701054103 +0000 UTC m=+2259.822237244" watchObservedRunningTime="2025-10-02 07:24:09.701998133 +0000 UTC m=+2259.823181264" Oct 02 07:24:09 crc kubenswrapper[4786]: I1002 07:24:09.708614 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:24:10 crc kubenswrapper[4786]: 
I1002 07:24:10.934910 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:10 crc kubenswrapper[4786]: I1002 07:24:10.934953 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:10 crc kubenswrapper[4786]: I1002 07:24:10.975973 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:11 crc kubenswrapper[4786]: I1002 07:24:11.202789 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-97g9j"] Oct 02 07:24:11 crc kubenswrapper[4786]: I1002 07:24:11.688162 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-97g9j" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerName="registry-server" containerID="cri-o://d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9" gracePeriod=2 Oct 02 07:24:11 crc kubenswrapper[4786]: I1002 07:24:11.804953 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2zb7n"] Oct 02 07:24:11 crc kubenswrapper[4786]: I1002 07:24:11.805276 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2zb7n" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerName="registry-server" containerID="cri-o://0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae" gracePeriod=2 Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.068798 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.134706 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.194910 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgb6b\" (UniqueName: \"kubernetes.io/projected/4fc28512-b1f9-4298-b044-528f1ed5da03-kube-api-access-mgb6b\") pod \"4fc28512-b1f9-4298-b044-528f1ed5da03\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.195580 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-catalog-content\") pod \"4fc28512-b1f9-4298-b044-528f1ed5da03\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.195824 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-utilities\") pod \"4fc28512-b1f9-4298-b044-528f1ed5da03\" (UID: \"4fc28512-b1f9-4298-b044-528f1ed5da03\") " Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.196309 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-utilities" (OuterVolumeSpecName: "utilities") pod "4fc28512-b1f9-4298-b044-528f1ed5da03" (UID: "4fc28512-b1f9-4298-b044-528f1ed5da03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.199434 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc28512-b1f9-4298-b044-528f1ed5da03-kube-api-access-mgb6b" (OuterVolumeSpecName: "kube-api-access-mgb6b") pod "4fc28512-b1f9-4298-b044-528f1ed5da03" (UID: "4fc28512-b1f9-4298-b044-528f1ed5da03"). InnerVolumeSpecName "kube-api-access-mgb6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.231685 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fc28512-b1f9-4298-b044-528f1ed5da03" (UID: "4fc28512-b1f9-4298-b044-528f1ed5da03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.297685 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-catalog-content\") pod \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.297762 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmdsc\" (UniqueName: \"kubernetes.io/projected/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-kube-api-access-gmdsc\") pod \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.297927 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-utilities\") pod \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\" (UID: \"4cb7ea57-f8c4-4b2e-977a-6e252dd76141\") " Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.298286 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.298299 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgb6b\" (UniqueName: 
\"kubernetes.io/projected/4fc28512-b1f9-4298-b044-528f1ed5da03-kube-api-access-mgb6b\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.298309 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fc28512-b1f9-4298-b044-528f1ed5da03-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.298403 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-utilities" (OuterVolumeSpecName: "utilities") pod "4cb7ea57-f8c4-4b2e-977a-6e252dd76141" (UID: "4cb7ea57-f8c4-4b2e-977a-6e252dd76141"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.301637 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-kube-api-access-gmdsc" (OuterVolumeSpecName: "kube-api-access-gmdsc") pod "4cb7ea57-f8c4-4b2e-977a-6e252dd76141" (UID: "4cb7ea57-f8c4-4b2e-977a-6e252dd76141"). InnerVolumeSpecName "kube-api-access-gmdsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.360887 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4cb7ea57-f8c4-4b2e-977a-6e252dd76141" (UID: "4cb7ea57-f8c4-4b2e-977a-6e252dd76141"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.399492 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.399516 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.399527 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmdsc\" (UniqueName: \"kubernetes.io/projected/4cb7ea57-f8c4-4b2e-977a-6e252dd76141-kube-api-access-gmdsc\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.696037 4786 generic.go:334] "Generic (PLEG): container finished" podID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerID="d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9" exitCode=0 Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.696084 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-97g9j" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.696075 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97g9j" event={"ID":"4fc28512-b1f9-4298-b044-528f1ed5da03","Type":"ContainerDied","Data":"d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9"} Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.696436 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-97g9j" event={"ID":"4fc28512-b1f9-4298-b044-528f1ed5da03","Type":"ContainerDied","Data":"cfd6f6ac2106830c8712404c3d7ffe36ca290925b4f687791a3e6955f1fcd3d2"} Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.696458 4786 scope.go:117] "RemoveContainer" containerID="d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.698663 4786 generic.go:334] "Generic (PLEG): container finished" podID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerID="0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae" exitCode=0 Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.698710 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2zb7n" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.698732 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb7n" event={"ID":"4cb7ea57-f8c4-4b2e-977a-6e252dd76141","Type":"ContainerDied","Data":"0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae"} Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.698788 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2zb7n" event={"ID":"4cb7ea57-f8c4-4b2e-977a-6e252dd76141","Type":"ContainerDied","Data":"4cef19847b5fe7079e635a5ea5886b04432401b21b5f4b01b4148419099db8d5"} Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.713425 4786 scope.go:117] "RemoveContainer" containerID="01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.724708 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-97g9j"] Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.731756 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-97g9j"] Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.737680 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2zb7n"] Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.742388 4786 scope.go:117] "RemoveContainer" containerID="1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.744036 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2zb7n"] Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.755397 4786 scope.go:117] "RemoveContainer" containerID="d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9" Oct 02 07:24:12 crc kubenswrapper[4786]: E1002 07:24:12.755659 
4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9\": container with ID starting with d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9 not found: ID does not exist" containerID="d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.755710 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9"} err="failed to get container status \"d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9\": rpc error: code = NotFound desc = could not find container \"d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9\": container with ID starting with d9943da669117b1cf0df4afac144046ca1aeb937a15c7fbca93b772c514afaf9 not found: ID does not exist" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.755740 4786 scope.go:117] "RemoveContainer" containerID="01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158" Oct 02 07:24:12 crc kubenswrapper[4786]: E1002 07:24:12.756008 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158\": container with ID starting with 01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158 not found: ID does not exist" containerID="01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.756037 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158"} err="failed to get container status \"01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158\": rpc error: code = 
NotFound desc = could not find container \"01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158\": container with ID starting with 01cff184d9c760a094562d62345195e1e2a8cf7698cb5ba6067b25646a527158 not found: ID does not exist" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.756055 4786 scope.go:117] "RemoveContainer" containerID="1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb" Oct 02 07:24:12 crc kubenswrapper[4786]: E1002 07:24:12.756288 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb\": container with ID starting with 1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb not found: ID does not exist" containerID="1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.756312 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb"} err="failed to get container status \"1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb\": rpc error: code = NotFound desc = could not find container \"1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb\": container with ID starting with 1c24b32ed08dcbcd27c2a1b005780b98541c798e8afdfbe9892578ea992b95bb not found: ID does not exist" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.756326 4786 scope.go:117] "RemoveContainer" containerID="0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.770129 4786 scope.go:117] "RemoveContainer" containerID="e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.783891 4786 scope.go:117] "RemoveContainer" 
containerID="7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.796254 4786 scope.go:117] "RemoveContainer" containerID="0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae" Oct 02 07:24:12 crc kubenswrapper[4786]: E1002 07:24:12.796513 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae\": container with ID starting with 0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae not found: ID does not exist" containerID="0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.796537 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae"} err="failed to get container status \"0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae\": rpc error: code = NotFound desc = could not find container \"0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae\": container with ID starting with 0b39c91bddc1dace9ab8375927d6325afe2bd9086dbcc90217634e4eeaa217ae not found: ID does not exist" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.796563 4786 scope.go:117] "RemoveContainer" containerID="e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8" Oct 02 07:24:12 crc kubenswrapper[4786]: E1002 07:24:12.796888 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8\": container with ID starting with e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8 not found: ID does not exist" containerID="e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8" Oct 02 07:24:12 crc 
kubenswrapper[4786]: I1002 07:24:12.796916 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8"} err="failed to get container status \"e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8\": rpc error: code = NotFound desc = could not find container \"e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8\": container with ID starting with e092d1642d58b020f73082203f8eccc2579076dd17b88f22108e5530428ea8e8 not found: ID does not exist" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.796935 4786 scope.go:117] "RemoveContainer" containerID="7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2" Oct 02 07:24:12 crc kubenswrapper[4786]: E1002 07:24:12.797124 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2\": container with ID starting with 7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2 not found: ID does not exist" containerID="7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2" Oct 02 07:24:12 crc kubenswrapper[4786]: I1002 07:24:12.797142 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2"} err="failed to get container status \"7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2\": rpc error: code = NotFound desc = could not find container \"7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2\": container with ID starting with 7f3516b0fb98291bcac3797e3bd5b64c20771bef50e99c02be9bba60c6c803b2 not found: ID does not exist" Oct 02 07:24:14 crc kubenswrapper[4786]: I1002 07:24:14.199287 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" 
path="/var/lib/kubelet/pods/4cb7ea57-f8c4-4b2e-977a-6e252dd76141/volumes" Oct 02 07:24:14 crc kubenswrapper[4786]: I1002 07:24:14.200258 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" path="/var/lib/kubelet/pods/4fc28512-b1f9-4298-b044-528f1ed5da03/volumes" Oct 02 07:24:20 crc kubenswrapper[4786]: I1002 07:24:20.970491 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lqpjl" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.016869 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqpjl"] Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.044249 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sdlr9"] Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.044506 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sdlr9" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerName="registry-server" containerID="cri-o://dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036" gracePeriod=2 Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.435899 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.550220 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-catalog-content\") pod \"033778c0-6165-45d9-b4fc-87d0cadd03d1\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.550378 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-utilities\") pod \"033778c0-6165-45d9-b4fc-87d0cadd03d1\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.550405 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5mng\" (UniqueName: \"kubernetes.io/projected/033778c0-6165-45d9-b4fc-87d0cadd03d1-kube-api-access-q5mng\") pod \"033778c0-6165-45d9-b4fc-87d0cadd03d1\" (UID: \"033778c0-6165-45d9-b4fc-87d0cadd03d1\") " Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.550914 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-utilities" (OuterVolumeSpecName: "utilities") pod "033778c0-6165-45d9-b4fc-87d0cadd03d1" (UID: "033778c0-6165-45d9-b4fc-87d0cadd03d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.564887 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033778c0-6165-45d9-b4fc-87d0cadd03d1-kube-api-access-q5mng" (OuterVolumeSpecName: "kube-api-access-q5mng") pod "033778c0-6165-45d9-b4fc-87d0cadd03d1" (UID: "033778c0-6165-45d9-b4fc-87d0cadd03d1"). InnerVolumeSpecName "kube-api-access-q5mng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.591935 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "033778c0-6165-45d9-b4fc-87d0cadd03d1" (UID: "033778c0-6165-45d9-b4fc-87d0cadd03d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.652641 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.652666 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5mng\" (UniqueName: \"kubernetes.io/projected/033778c0-6165-45d9-b4fc-87d0cadd03d1-kube-api-access-q5mng\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.652677 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/033778c0-6165-45d9-b4fc-87d0cadd03d1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.756053 4786 generic.go:334] "Generic (PLEG): container finished" podID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerID="dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036" exitCode=0 Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.756460 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sdlr9" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.756453 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlr9" event={"ID":"033778c0-6165-45d9-b4fc-87d0cadd03d1","Type":"ContainerDied","Data":"dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036"} Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.756505 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sdlr9" event={"ID":"033778c0-6165-45d9-b4fc-87d0cadd03d1","Type":"ContainerDied","Data":"0a08ea7a3fc046a4d094012af49f7fe7cecea4489c1ac68b6aeafa56e0b7d6e7"} Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.756521 4786 scope.go:117] "RemoveContainer" containerID="dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.775490 4786 scope.go:117] "RemoveContainer" containerID="3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.783071 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sdlr9"] Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.787823 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sdlr9"] Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.818152 4786 scope.go:117] "RemoveContainer" containerID="b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.844853 4786 scope.go:117] "RemoveContainer" containerID="dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036" Oct 02 07:24:21 crc kubenswrapper[4786]: E1002 07:24:21.845250 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036\": container with ID starting with dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036 not found: ID does not exist" containerID="dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.845288 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036"} err="failed to get container status \"dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036\": rpc error: code = NotFound desc = could not find container \"dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036\": container with ID starting with dfe9b202192c72d05b9e0a0c5b2d72f90d59fa623198bb82d77d03ca1fc09036 not found: ID does not exist" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.845314 4786 scope.go:117] "RemoveContainer" containerID="3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884" Oct 02 07:24:21 crc kubenswrapper[4786]: E1002 07:24:21.853888 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884\": container with ID starting with 3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884 not found: ID does not exist" containerID="3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.853929 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884"} err="failed to get container status \"3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884\": rpc error: code = NotFound desc = could not find container \"3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884\": container with ID 
starting with 3171adffa0c3ae12bf8395d1a68d45cbadc4c7a1ecc0da515cb10f06c0fb9884 not found: ID does not exist" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.853953 4786 scope.go:117] "RemoveContainer" containerID="b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a" Oct 02 07:24:21 crc kubenswrapper[4786]: E1002 07:24:21.854359 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a\": container with ID starting with b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a not found: ID does not exist" containerID="b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a" Oct 02 07:24:21 crc kubenswrapper[4786]: I1002 07:24:21.854393 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a"} err="failed to get container status \"b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a\": rpc error: code = NotFound desc = could not find container \"b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a\": container with ID starting with b7083fc3104c11df5485d8c906913ca1fe23472a3a359b1699b10750a846412a not found: ID does not exist" Oct 02 07:24:22 crc kubenswrapper[4786]: I1002 07:24:22.179740 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:24:22 crc kubenswrapper[4786]: E1002 07:24:22.180000 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" 
podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:24:22 crc kubenswrapper[4786]: I1002 07:24:22.189777 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" path="/var/lib/kubelet/pods/033778c0-6165-45d9-b4fc-87d0cadd03d1/volumes" Oct 02 07:24:33 crc kubenswrapper[4786]: I1002 07:24:33.179826 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:24:33 crc kubenswrapper[4786]: E1002 07:24:33.180386 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:24:44 crc kubenswrapper[4786]: I1002 07:24:44.179280 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:24:44 crc kubenswrapper[4786]: E1002 07:24:44.180075 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:24:58 crc kubenswrapper[4786]: I1002 07:24:58.179320 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:24:58 crc kubenswrapper[4786]: E1002 07:24:58.179876 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:25:11 crc kubenswrapper[4786]: I1002 07:25:11.178832 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:25:11 crc kubenswrapper[4786]: E1002 07:25:11.179364 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:25:22 crc kubenswrapper[4786]: I1002 07:25:22.178930 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:25:22 crc kubenswrapper[4786]: E1002 07:25:22.179508 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:25:35 crc kubenswrapper[4786]: I1002 07:25:35.178797 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:25:35 crc kubenswrapper[4786]: E1002 07:25:35.179322 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:25:48 crc kubenswrapper[4786]: I1002 07:25:48.179302 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:25:48 crc kubenswrapper[4786]: E1002 07:25:48.180105 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:26:03 crc kubenswrapper[4786]: I1002 07:26:03.179094 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:26:03 crc kubenswrapper[4786]: E1002 07:26:03.179881 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:26:15 crc kubenswrapper[4786]: I1002 07:26:15.180672 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:26:15 crc kubenswrapper[4786]: E1002 07:26:15.181654 4786 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:26:30 crc kubenswrapper[4786]: I1002 07:26:30.184609 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:26:30 crc kubenswrapper[4786]: E1002 07:26:30.185284 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:26:44 crc kubenswrapper[4786]: I1002 07:26:44.179232 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:26:44 crc kubenswrapper[4786]: E1002 07:26:44.179936 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:26:59 crc kubenswrapper[4786]: I1002 07:26:59.179720 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:26:59 crc kubenswrapper[4786]: E1002 07:26:59.180441 4786 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:27:11 crc kubenswrapper[4786]: I1002 07:27:11.178796 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:27:11 crc kubenswrapper[4786]: E1002 07:27:11.179341 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:27:25 crc kubenswrapper[4786]: I1002 07:27:25.179018 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:27:25 crc kubenswrapper[4786]: E1002 07:27:25.179463 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:27:39 crc kubenswrapper[4786]: I1002 07:27:39.179560 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:27:39 crc kubenswrapper[4786]: E1002 07:27:39.180115 4786 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:27:50 crc kubenswrapper[4786]: I1002 07:27:50.184027 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:27:50 crc kubenswrapper[4786]: E1002 07:27:50.184590 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:28:05 crc kubenswrapper[4786]: I1002 07:28:05.179567 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:28:05 crc kubenswrapper[4786]: E1002 07:28:05.180183 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:28:18 crc kubenswrapper[4786]: I1002 07:28:18.179572 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:28:18 crc kubenswrapper[4786]: E1002 
07:28:18.180352 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:28:31 crc kubenswrapper[4786]: I1002 07:28:31.179389 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:28:32 crc kubenswrapper[4786]: I1002 07:28:32.250102 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"834d772f522a6b607bc93d02371b22f2e883bf16a4b9df657cb1c41e0c68d418"} Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.130528 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr"] Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131412 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerName="extract-content" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131425 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerName="extract-content" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131441 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerName="extract-utilities" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131448 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerName="extract-utilities" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 
07:30:00.131461 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerName="extract-utilities" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131467 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerName="extract-utilities" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131484 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerName="extract-content" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131490 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerName="extract-content" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131499 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerName="extract-content" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131504 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerName="extract-content" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131517 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131522 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131536 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerName="extract-content" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131541 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerName="extract-content" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 
07:30:00.131553 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131558 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131569 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131574 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131585 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerName="extract-utilities" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131591 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerName="extract-utilities" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131601 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerName="extract-utilities" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131606 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerName="extract-utilities" Oct 02 07:30:00 crc kubenswrapper[4786]: E1002 07:30:00.131614 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131620 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 
07:30:00.131791 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb7ea57-f8c4-4b2e-977a-6e252dd76141" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131820 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf0a18c-9a73-4638-b581-3f435abe1fbe" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131832 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="033778c0-6165-45d9-b4fc-87d0cadd03d1" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.131839 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fc28512-b1f9-4298-b044-528f1ed5da03" containerName="registry-server" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.132385 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.135668 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.135758 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.138451 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr"] Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.187872 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvtn\" (UniqueName: \"kubernetes.io/projected/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-kube-api-access-gzvtn\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.187927 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-secret-volume\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.187946 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-config-volume\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.289304 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvtn\" (UniqueName: \"kubernetes.io/projected/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-kube-api-access-gzvtn\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.289354 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-secret-volume\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.289375 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-config-volume\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.290170 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-config-volume\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.293740 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-secret-volume\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.302557 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvtn\" (UniqueName: \"kubernetes.io/projected/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-kube-api-access-gzvtn\") pod \"collect-profiles-29323170-s5spr\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.448672 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:00 crc kubenswrapper[4786]: I1002 07:30:00.816343 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr"] Oct 02 07:30:01 crc kubenswrapper[4786]: I1002 07:30:01.803172 4786 generic.go:334] "Generic (PLEG): container finished" podID="4c9b3b96-57a4-4d81-b973-51bc43ceaf8e" containerID="7ea7cd1c72b4b630000b16b2370ca492bc192cd0fcd2a14eb1acdd54d8e21f81" exitCode=0 Oct 02 07:30:01 crc kubenswrapper[4786]: I1002 07:30:01.803223 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" event={"ID":"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e","Type":"ContainerDied","Data":"7ea7cd1c72b4b630000b16b2370ca492bc192cd0fcd2a14eb1acdd54d8e21f81"} Oct 02 07:30:01 crc kubenswrapper[4786]: I1002 07:30:01.803596 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" event={"ID":"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e","Type":"ContainerStarted","Data":"02ba0f60ce34b34fe4a89e27a4de581714a3b591d77c062c1ed75d839953be67"} Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.063612 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.237062 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-secret-volume\") pod \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.237155 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzvtn\" (UniqueName: \"kubernetes.io/projected/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-kube-api-access-gzvtn\") pod \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.237185 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-config-volume\") pod \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\" (UID: \"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e\") " Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.237749 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-config-volume" (OuterVolumeSpecName: "config-volume") pod "4c9b3b96-57a4-4d81-b973-51bc43ceaf8e" (UID: "4c9b3b96-57a4-4d81-b973-51bc43ceaf8e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.242043 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-kube-api-access-gzvtn" (OuterVolumeSpecName: "kube-api-access-gzvtn") pod "4c9b3b96-57a4-4d81-b973-51bc43ceaf8e" (UID: "4c9b3b96-57a4-4d81-b973-51bc43ceaf8e"). 
InnerVolumeSpecName "kube-api-access-gzvtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.242227 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4c9b3b96-57a4-4d81-b973-51bc43ceaf8e" (UID: "4c9b3b96-57a4-4d81-b973-51bc43ceaf8e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.339503 4786 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.339531 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzvtn\" (UniqueName: \"kubernetes.io/projected/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-kube-api-access-gzvtn\") on node \"crc\" DevicePath \"\"" Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.339542 4786 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c9b3b96-57a4-4d81-b973-51bc43ceaf8e-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.816729 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" event={"ID":"4c9b3b96-57a4-4d81-b973-51bc43ceaf8e","Type":"ContainerDied","Data":"02ba0f60ce34b34fe4a89e27a4de581714a3b591d77c062c1ed75d839953be67"} Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.816763 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ba0f60ce34b34fe4a89e27a4de581714a3b591d77c062c1ed75d839953be67" Oct 02 07:30:03 crc kubenswrapper[4786]: I1002 07:30:03.816770 4786 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323170-s5spr" Oct 02 07:30:04 crc kubenswrapper[4786]: I1002 07:30:04.112510 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd"] Oct 02 07:30:04 crc kubenswrapper[4786]: I1002 07:30:04.118205 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323125-z9rdd"] Oct 02 07:30:04 crc kubenswrapper[4786]: I1002 07:30:04.187296 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d91d0313-de7a-4ba3-85c6-0b96327f9ee2" path="/var/lib/kubelet/pods/d91d0313-de7a-4ba3-85c6-0b96327f9ee2/volumes" Oct 02 07:30:39 crc kubenswrapper[4786]: I1002 07:30:39.565470 4786 scope.go:117] "RemoveContainer" containerID="8e70dda2fbaf046562f5b647932cdd5df9e7e44b9949e1fc302673642776cfc9" Oct 02 07:30:57 crc kubenswrapper[4786]: I1002 07:30:57.497546 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:30:57 crc kubenswrapper[4786]: I1002 07:30:57.497933 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:31:27 crc kubenswrapper[4786]: I1002 07:31:27.497613 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:31:27 crc kubenswrapper[4786]: I1002 07:31:27.497973 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:31:57 crc kubenswrapper[4786]: I1002 07:31:57.497495 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:31:57 crc kubenswrapper[4786]: I1002 07:31:57.497941 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:31:57 crc kubenswrapper[4786]: I1002 07:31:57.497978 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 07:31:57 crc kubenswrapper[4786]: I1002 07:31:57.498552 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"834d772f522a6b607bc93d02371b22f2e883bf16a4b9df657cb1c41e0c68d418"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 07:31:57 crc kubenswrapper[4786]: I1002 07:31:57.498599 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://834d772f522a6b607bc93d02371b22f2e883bf16a4b9df657cb1c41e0c68d418" gracePeriod=600 Oct 02 07:31:58 crc kubenswrapper[4786]: I1002 07:31:58.508418 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="834d772f522a6b607bc93d02371b22f2e883bf16a4b9df657cb1c41e0c68d418" exitCode=0 Oct 02 07:31:58 crc kubenswrapper[4786]: I1002 07:31:58.508485 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"834d772f522a6b607bc93d02371b22f2e883bf16a4b9df657cb1c41e0c68d418"} Oct 02 07:31:58 crc kubenswrapper[4786]: I1002 07:31:58.508921 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5"} Oct 02 07:31:58 crc kubenswrapper[4786]: I1002 07:31:58.508948 4786 scope.go:117] "RemoveContainer" containerID="5875e1d010ba74d09474cd7e3e02da3988526760a5b4eb1311c1296c4c6ddbb5" Oct 02 07:32:58 crc kubenswrapper[4786]: I1002 07:32:58.890721 4786 generic.go:334] "Generic (PLEG): container finished" podID="2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" containerID="4e85807243d2fc5c114c6268f0accbd515caa6fd0c9f6cc05ddbfc24ac979a7f" exitCode=0 Oct 02 07:32:58 crc kubenswrapper[4786]: I1002 07:32:58.890793 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9","Type":"ContainerDied","Data":"4e85807243d2fc5c114c6268f0accbd515caa6fd0c9f6cc05ddbfc24ac979a7f"} Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.173864 4786 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.256581 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config-secret\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.256905 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpkjr\" (UniqueName: \"kubernetes.io/projected/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-kube-api-access-hpkjr\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.256926 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ca-certs\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.256954 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.256989 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-workdir\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.257015 4786 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ssh-key\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.257054 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.257072 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-config-data\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.257115 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-temporary\") pod \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\" (UID: \"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9\") " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.257675 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.257928 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-config-data" (OuterVolumeSpecName: "config-data") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.262166 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.262284 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.262324 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-kube-api-access-hpkjr" (OuterVolumeSpecName: "kube-api-access-hpkjr") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "kube-api-access-hpkjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.278736 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.279328 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.280612 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.294805 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" (UID: "2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.359880 4786 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.360012 4786 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.360078 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.360133 4786 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.360182 4786 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.360239 4786 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.360309 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpkjr\" (UniqueName: \"kubernetes.io/projected/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-kube-api-access-hpkjr\") on node \"crc\" 
DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.360362 4786 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.360449 4786 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.380910 4786 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.461770 4786 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.904808 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9","Type":"ContainerDied","Data":"77d9babc089bb332e17913cf1d4f567ecaaca3f8e20714af339acb906e0c0065"} Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.904851 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77d9babc089bb332e17913cf1d4f567ecaaca3f8e20714af339acb906e0c0065" Oct 02 07:33:00 crc kubenswrapper[4786]: I1002 07:33:00.904851 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.023907 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 07:33:04 crc kubenswrapper[4786]: E1002 07:33:04.025319 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c9b3b96-57a4-4d81-b973-51bc43ceaf8e" containerName="collect-profiles" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.025403 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9b3b96-57a4-4d81-b973-51bc43ceaf8e" containerName="collect-profiles" Oct 02 07:33:04 crc kubenswrapper[4786]: E1002 07:33:04.025484 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" containerName="tempest-tests-tempest-tests-runner" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.025541 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" containerName="tempest-tests-tempest-tests-runner" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.025843 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c9b3b96-57a4-4d81-b973-51bc43ceaf8e" containerName="collect-profiles" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.025921 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9" containerName="tempest-tests-tempest-tests-runner" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.026845 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.032770 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-6n4v6" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.036604 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.227054 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.227205 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr2sx\" (UniqueName: \"kubernetes.io/projected/a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d-kube-api-access-dr2sx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.329118 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr2sx\" (UniqueName: \"kubernetes.io/projected/a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d-kube-api-access-dr2sx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.329531 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.329976 4786 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.348502 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr2sx\" (UniqueName: \"kubernetes.io/projected/a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d-kube-api-access-dr2sx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.348907 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:04 crc kubenswrapper[4786]: I1002 07:33:04.643542 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 02 07:33:05 crc kubenswrapper[4786]: I1002 07:33:05.021507 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 02 07:33:05 crc kubenswrapper[4786]: I1002 07:33:05.026771 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 07:33:05 crc kubenswrapper[4786]: I1002 07:33:05.945148 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d","Type":"ContainerStarted","Data":"c47d30adba7181cb22ece8d298b0c80d31d9312ddaffefa28e96537372f36930"} Oct 02 07:33:06 crc kubenswrapper[4786]: I1002 07:33:06.958273 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d","Type":"ContainerStarted","Data":"0a4f244d9b68a239e685c8a1039e2ab0c151fc6058e4b64c3d39bd7b8fb38e8d"} Oct 02 07:33:06 crc kubenswrapper[4786]: I1002 07:33:06.972673 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.986097092 podStartE2EDuration="2.97262252s" podCreationTimestamp="2025-10-02 07:33:04 +0000 UTC" firstStartedPulling="2025-10-02 07:33:05.026564004 +0000 UTC m=+2795.147747136" lastFinishedPulling="2025-10-02 07:33:06.013089433 +0000 UTC m=+2796.134272564" observedRunningTime="2025-10-02 07:33:06.970484438 +0000 UTC m=+2797.091667579" watchObservedRunningTime="2025-10-02 07:33:06.97262252 +0000 UTC m=+2797.093805650" Oct 02 07:33:19 crc kubenswrapper[4786]: I1002 07:33:19.932366 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7wpts/must-gather-pk9zt"] Oct 02 07:33:19 crc kubenswrapper[4786]: I1002 
07:33:19.933983 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:33:19 crc kubenswrapper[4786]: I1002 07:33:19.936131 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7wpts"/"default-dockercfg-bwh2z" Oct 02 07:33:19 crc kubenswrapper[4786]: I1002 07:33:19.939559 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7wpts"/"openshift-service-ca.crt" Oct 02 07:33:19 crc kubenswrapper[4786]: I1002 07:33:19.939625 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7wpts"/"kube-root-ca.crt" Oct 02 07:33:19 crc kubenswrapper[4786]: I1002 07:33:19.950520 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7wpts/must-gather-pk9zt"] Oct 02 07:33:20 crc kubenswrapper[4786]: I1002 07:33:20.013062 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkpd5\" (UniqueName: \"kubernetes.io/projected/d32bb154-ceb0-4098-b783-4be98ec66836-kube-api-access-dkpd5\") pod \"must-gather-pk9zt\" (UID: \"d32bb154-ceb0-4098-b783-4be98ec66836\") " pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:33:20 crc kubenswrapper[4786]: I1002 07:33:20.013101 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d32bb154-ceb0-4098-b783-4be98ec66836-must-gather-output\") pod \"must-gather-pk9zt\" (UID: \"d32bb154-ceb0-4098-b783-4be98ec66836\") " pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:33:20 crc kubenswrapper[4786]: I1002 07:33:20.114539 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkpd5\" (UniqueName: \"kubernetes.io/projected/d32bb154-ceb0-4098-b783-4be98ec66836-kube-api-access-dkpd5\") pod \"must-gather-pk9zt\" 
(UID: \"d32bb154-ceb0-4098-b783-4be98ec66836\") " pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:33:20 crc kubenswrapper[4786]: I1002 07:33:20.114577 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d32bb154-ceb0-4098-b783-4be98ec66836-must-gather-output\") pod \"must-gather-pk9zt\" (UID: \"d32bb154-ceb0-4098-b783-4be98ec66836\") " pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:33:20 crc kubenswrapper[4786]: I1002 07:33:20.115063 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d32bb154-ceb0-4098-b783-4be98ec66836-must-gather-output\") pod \"must-gather-pk9zt\" (UID: \"d32bb154-ceb0-4098-b783-4be98ec66836\") " pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:33:20 crc kubenswrapper[4786]: I1002 07:33:20.129342 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkpd5\" (UniqueName: \"kubernetes.io/projected/d32bb154-ceb0-4098-b783-4be98ec66836-kube-api-access-dkpd5\") pod \"must-gather-pk9zt\" (UID: \"d32bb154-ceb0-4098-b783-4be98ec66836\") " pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:33:20 crc kubenswrapper[4786]: I1002 07:33:20.250862 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:33:20 crc kubenswrapper[4786]: I1002 07:33:20.607455 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7wpts/must-gather-pk9zt"] Oct 02 07:33:21 crc kubenswrapper[4786]: I1002 07:33:21.051018 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/must-gather-pk9zt" event={"ID":"d32bb154-ceb0-4098-b783-4be98ec66836","Type":"ContainerStarted","Data":"cc239c61831aadb00ba055e17870105120af98617d7474898c9fb46967cc7fcd"} Oct 02 07:33:25 crc kubenswrapper[4786]: I1002 07:33:25.081853 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/must-gather-pk9zt" event={"ID":"d32bb154-ceb0-4098-b783-4be98ec66836","Type":"ContainerStarted","Data":"59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9"} Oct 02 07:33:26 crc kubenswrapper[4786]: I1002 07:33:26.090298 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/must-gather-pk9zt" event={"ID":"d32bb154-ceb0-4098-b783-4be98ec66836","Type":"ContainerStarted","Data":"b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae"} Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.538233 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7wpts/must-gather-pk9zt" podStartSLOduration=4.26417568 podStartE2EDuration="8.538217805s" podCreationTimestamp="2025-10-02 07:33:19 +0000 UTC" firstStartedPulling="2025-10-02 07:33:20.611711253 +0000 UTC m=+2810.732894384" lastFinishedPulling="2025-10-02 07:33:24.885753378 +0000 UTC m=+2815.006936509" observedRunningTime="2025-10-02 07:33:26.102541372 +0000 UTC m=+2816.223724513" watchObservedRunningTime="2025-10-02 07:33:27.538217805 +0000 UTC m=+2817.659400936" Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.544505 4786 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-7wpts/crc-debug-qzlhf"] Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.545413 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.632568 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67f1c797-1feb-4f08-8211-38d2375023b0-host\") pod \"crc-debug-qzlhf\" (UID: \"67f1c797-1feb-4f08-8211-38d2375023b0\") " pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.632904 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kf4l\" (UniqueName: \"kubernetes.io/projected/67f1c797-1feb-4f08-8211-38d2375023b0-kube-api-access-9kf4l\") pod \"crc-debug-qzlhf\" (UID: \"67f1c797-1feb-4f08-8211-38d2375023b0\") " pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.735064 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67f1c797-1feb-4f08-8211-38d2375023b0-host\") pod \"crc-debug-qzlhf\" (UID: \"67f1c797-1feb-4f08-8211-38d2375023b0\") " pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.735202 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67f1c797-1feb-4f08-8211-38d2375023b0-host\") pod \"crc-debug-qzlhf\" (UID: \"67f1c797-1feb-4f08-8211-38d2375023b0\") " pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.735348 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kf4l\" (UniqueName: 
\"kubernetes.io/projected/67f1c797-1feb-4f08-8211-38d2375023b0-kube-api-access-9kf4l\") pod \"crc-debug-qzlhf\" (UID: \"67f1c797-1feb-4f08-8211-38d2375023b0\") " pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.754007 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kf4l\" (UniqueName: \"kubernetes.io/projected/67f1c797-1feb-4f08-8211-38d2375023b0-kube-api-access-9kf4l\") pod \"crc-debug-qzlhf\" (UID: \"67f1c797-1feb-4f08-8211-38d2375023b0\") " pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:33:27 crc kubenswrapper[4786]: I1002 07:33:27.861640 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:33:27 crc kubenswrapper[4786]: W1002 07:33:27.887353 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67f1c797_1feb_4f08_8211_38d2375023b0.slice/crio-315ae074bcb2490ed565a6b5037573d7fb9c1db64c576ea5b08ddd14a5dab8d2 WatchSource:0}: Error finding container 315ae074bcb2490ed565a6b5037573d7fb9c1db64c576ea5b08ddd14a5dab8d2: Status 404 returned error can't find the container with id 315ae074bcb2490ed565a6b5037573d7fb9c1db64c576ea5b08ddd14a5dab8d2 Oct 02 07:33:28 crc kubenswrapper[4786]: I1002 07:33:28.102439 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/crc-debug-qzlhf" event={"ID":"67f1c797-1feb-4f08-8211-38d2375023b0","Type":"ContainerStarted","Data":"315ae074bcb2490ed565a6b5037573d7fb9c1db64c576ea5b08ddd14a5dab8d2"} Oct 02 07:33:37 crc kubenswrapper[4786]: I1002 07:33:37.176988 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7wpts/crc-debug-qzlhf" podStartSLOduration=1.06486824 podStartE2EDuration="10.176974446s" podCreationTimestamp="2025-10-02 07:33:27 +0000 UTC" firstStartedPulling="2025-10-02 
07:33:27.888932104 +0000 UTC m=+2818.010115235" lastFinishedPulling="2025-10-02 07:33:37.00103831 +0000 UTC m=+2827.122221441" observedRunningTime="2025-10-02 07:33:37.174152325 +0000 UTC m=+2827.295335456" watchObservedRunningTime="2025-10-02 07:33:37.176974446 +0000 UTC m=+2827.298157577" Oct 02 07:33:38 crc kubenswrapper[4786]: I1002 07:33:38.171468 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/crc-debug-qzlhf" event={"ID":"67f1c797-1feb-4f08-8211-38d2375023b0","Type":"ContainerStarted","Data":"be346b664ec44ba5400310bd5c21c62d9eb187f0b9397c405dc0153ce3ec446f"} Oct 02 07:33:57 crc kubenswrapper[4786]: I1002 07:33:57.497125 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:33:57 crc kubenswrapper[4786]: I1002 07:33:57.497492 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:34:10 crc kubenswrapper[4786]: I1002 07:34:10.357722 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c4f87b654-hv865_ed22a27c-dae6-448c-b789-85add05aff31/barbican-api-log/0.log" Oct 02 07:34:10 crc kubenswrapper[4786]: I1002 07:34:10.380014 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c4f87b654-hv865_ed22a27c-dae6-448c-b789-85add05aff31/barbican-api/0.log" Oct 02 07:34:10 crc kubenswrapper[4786]: I1002 07:34:10.547437 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-755cbf448b-g8s9l_812267df-c387-42a6-a6b4-758beccdd77d/barbican-keystone-listener/0.log" Oct 02 07:34:10 crc kubenswrapper[4786]: I1002 07:34:10.573608 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-755cbf448b-g8s9l_812267df-c387-42a6-a6b4-758beccdd77d/barbican-keystone-listener-log/0.log" Oct 02 07:34:10 crc kubenswrapper[4786]: I1002 07:34:10.691684 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fc459f649-l589x_af54426b-d853-4806-bdf8-c1fd22cb6752/barbican-worker/0.log" Oct 02 07:34:10 crc kubenswrapper[4786]: I1002 07:34:10.734546 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fc459f649-l589x_af54426b-d853-4806-bdf8-c1fd22cb6752/barbican-worker-log/0.log" Oct 02 07:34:10 crc kubenswrapper[4786]: I1002 07:34:10.858888 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf_2bd6d4c9-950c-4051-a20e-cfae8655dab2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.038291 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bebe6de-020d-4c9d-b4ea-a3069dace1c8/ceilometer-notification-agent/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.061105 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bebe6de-020d-4c9d-b4ea-a3069dace1c8/ceilometer-central-agent/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.107546 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bebe6de-020d-4c9d-b4ea-a3069dace1c8/proxy-httpd/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.216582 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bebe6de-020d-4c9d-b4ea-a3069dace1c8/sg-core/0.log" Oct 
02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.255275 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_546a4a1f-7f71-4714-8d8c-a012948427bb/cinder-api/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.343878 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_546a4a1f-7f71-4714-8d8c-a012948427bb/cinder-api-log/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.406998 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fda2a685-4713-46a8-9630-6d9d70f80bf7/cinder-scheduler/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.516561 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fda2a685-4713-46a8-9630-6d9d70f80bf7/probe/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.578817 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq_2747d6cd-de52-43b7-a2d0-16c86deecd42/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.737898 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp_169def9f-29e2-41fb-bf34-86464f366256/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.867048 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g_d3db3504-6495-465c-8c96-90b80bdcb97e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:11 crc kubenswrapper[4786]: I1002 07:34:11.910496 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-fdcbb4567-8fktd_1fc331cd-16f8-41c1-8a54-7259f7c5fecb/init/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.055872 4786 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-fdcbb4567-8fktd_1fc331cd-16f8-41c1-8a54-7259f7c5fecb/init/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.105548 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-fdcbb4567-8fktd_1fc331cd-16f8-41c1-8a54-7259f7c5fecb/dnsmasq-dns/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.236492 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw_aa4315e2-27f5-4b58-91a6-1d5b683c6e55/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.294373 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cbc5c046-7b05-4041-bb1f-a9851dde1c79/glance-httpd/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.401993 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cbc5c046-7b05-4041-bb1f-a9851dde1c79/glance-log/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.477407 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_326311ac-c390-46ad-bdd7-29ce60f094bc/glance-httpd/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.543420 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_326311ac-c390-46ad-bdd7-29ce60f094bc/glance-log/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.681644 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6_2b54f55b-d53c-449b-a2e3-6ca66ae19657/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:12 crc kubenswrapper[4786]: I1002 07:34:12.818527 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7r8j5_4267f5c8-feae-4e65-a80a-ebb4c7003eaf/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:13 crc kubenswrapper[4786]: I1002 07:34:13.106825 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e002986c-0138-4b99-98a9-3d2095810fb4/kube-state-metrics/0.log" Oct 02 07:34:13 crc kubenswrapper[4786]: I1002 07:34:13.130823 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-749d95758-7hp9f_1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7/keystone-api/0.log" Oct 02 07:34:13 crc kubenswrapper[4786]: I1002 07:34:13.250589 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7_f597e524-6913-4445-ac68-00fb20d044b8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:13 crc kubenswrapper[4786]: I1002 07:34:13.518123 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c985b58cc-mrm2m_89685228-db75-441e-83ae-74720db4de72/neutron-httpd/0.log" Oct 02 07:34:13 crc kubenswrapper[4786]: I1002 07:34:13.552312 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c985b58cc-mrm2m_89685228-db75-441e-83ae-74720db4de72/neutron-api/0.log" Oct 02 07:34:13 crc kubenswrapper[4786]: I1002 07:34:13.684642 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt_5e5e53e7-0f7d-4d7a-b410-364982cf5311/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.121947 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5925ec23-28fa-42cb-9f3b-aa96d2efab12/nova-api-log/0.log" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.163128 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_5925ec23-28fa-42cb-9f3b-aa96d2efab12/nova-api-api/0.log" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.213902 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lj274"] Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.216406 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.223073 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lj274"] Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.263241 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_6ca74ccc-e756-4687-ad8e-7ffecd4b92f7/nova-cell0-conductor-conductor/0.log" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.300961 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2240fee8-3d4b-45e9-b7a1-242d5102f56e-catalog-content\") pod \"community-operators-lj274\" (UID: \"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.301037 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2240fee8-3d4b-45e9-b7a1-242d5102f56e-utilities\") pod \"community-operators-lj274\" (UID: \"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.301100 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw9km\" (UniqueName: \"kubernetes.io/projected/2240fee8-3d4b-45e9-b7a1-242d5102f56e-kube-api-access-hw9km\") pod 
\"community-operators-lj274\" (UID: \"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.403027 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2240fee8-3d4b-45e9-b7a1-242d5102f56e-catalog-content\") pod \"community-operators-lj274\" (UID: \"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.403122 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2240fee8-3d4b-45e9-b7a1-242d5102f56e-utilities\") pod \"community-operators-lj274\" (UID: \"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.403173 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw9km\" (UniqueName: \"kubernetes.io/projected/2240fee8-3d4b-45e9-b7a1-242d5102f56e-kube-api-access-hw9km\") pod \"community-operators-lj274\" (UID: \"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.403803 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2240fee8-3d4b-45e9-b7a1-242d5102f56e-catalog-content\") pod \"community-operators-lj274\" (UID: \"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.403847 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2240fee8-3d4b-45e9-b7a1-242d5102f56e-utilities\") pod \"community-operators-lj274\" (UID: 
\"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.428246 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw9km\" (UniqueName: \"kubernetes.io/projected/2240fee8-3d4b-45e9-b7a1-242d5102f56e-kube-api-access-hw9km\") pod \"community-operators-lj274\" (UID: \"2240fee8-3d4b-45e9-b7a1-242d5102f56e\") " pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.456944 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9c68bf5e-fcbd-48d6-bcfb-f36d68381c09/nova-cell1-conductor-conductor/0.log" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.539864 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:14 crc kubenswrapper[4786]: I1002 07:34:14.841904 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_841f7177-db9e-4e73-a7b4-ccf30d1538fb/nova-cell1-novncproxy-novncproxy/0.log" Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.004266 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qnbsm_0378af36-7eb4-4ef7-bf6a-dc2bd678c31e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.068185 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15256076-c838-461c-98d9-4ae4883be465/nova-metadata-log/0.log" Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.138179 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lj274"] Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.426150 4786 generic.go:334] "Generic (PLEG): container finished" podID="2240fee8-3d4b-45e9-b7a1-242d5102f56e" 
containerID="b812d1d0a8629bcbde3af93dd269c89757668c67a639b9d34998596a8f2c2ffd" exitCode=0 Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.426350 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj274" event={"ID":"2240fee8-3d4b-45e9-b7a1-242d5102f56e","Type":"ContainerDied","Data":"b812d1d0a8629bcbde3af93dd269c89757668c67a639b9d34998596a8f2c2ffd"} Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.426373 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj274" event={"ID":"2240fee8-3d4b-45e9-b7a1-242d5102f56e","Type":"ContainerStarted","Data":"1fc4076f91b94b518bf832affdbb33d09adc58eca6b912e7897faee2f6248ae0"} Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.435488 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_945deef9-0f62-4eeb-af6c-beb702d5458b/nova-scheduler-scheduler/0.log" Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.594902 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4c62a8c-eb69-4f97-a299-a44f87315f81/mysql-bootstrap/0.log" Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.770520 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4c62a8c-eb69-4f97-a299-a44f87315f81/mysql-bootstrap/0.log" Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.796371 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4c62a8c-eb69-4f97-a299-a44f87315f81/galera/0.log" Oct 02 07:34:15 crc kubenswrapper[4786]: I1002 07:34:15.980968 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15256076-c838-461c-98d9-4ae4883be465/nova-metadata-metadata/0.log" Oct 02 07:34:16 crc kubenswrapper[4786]: I1002 07:34:16.000015 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_106dd908-3398-489e-a39f-b684b5eecd2b/mysql-bootstrap/0.log" Oct 02 07:34:16 crc kubenswrapper[4786]: I1002 07:34:16.209386 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_106dd908-3398-489e-a39f-b684b5eecd2b/mysql-bootstrap/0.log" Oct 02 07:34:16 crc kubenswrapper[4786]: I1002 07:34:16.218631 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_106dd908-3398-489e-a39f-b684b5eecd2b/galera/0.log" Oct 02 07:34:16 crc kubenswrapper[4786]: I1002 07:34:16.391855 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f57411f4-01bc-4ecc-a240-9101e861f97d/openstackclient/0.log" Oct 02 07:34:16 crc kubenswrapper[4786]: I1002 07:34:16.462291 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-gl77n_d0b71af7-f7f4-45e8-b8f0-c7428f54a37d/ovn-controller/0.log" Oct 02 07:34:16 crc kubenswrapper[4786]: I1002 07:34:16.777721 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jz66d_003620a3-92b9-4640-be9e-9a8b064fc888/openstack-network-exporter/0.log" Oct 02 07:34:16 crc kubenswrapper[4786]: I1002 07:34:16.943660 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lf7tj_452e4743-083e-420b-9dfc-ea81e1376373/ovsdb-server-init/0.log" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.012615 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jkm8t"] Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.014373 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.027427 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jkm8t"] Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.054617 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hdf\" (UniqueName: \"kubernetes.io/projected/7aa8afa9-2afe-4918-99a0-367734537c98-kube-api-access-n5hdf\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.054655 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-utilities\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.054736 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-catalog-content\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.158478 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hdf\" (UniqueName: \"kubernetes.io/projected/7aa8afa9-2afe-4918-99a0-367734537c98-kube-api-access-n5hdf\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.158526 4786 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-utilities\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.158671 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-catalog-content\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.159263 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-catalog-content\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.159760 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-utilities\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.177402 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hdf\" (UniqueName: \"kubernetes.io/projected/7aa8afa9-2afe-4918-99a0-367734537c98-kube-api-access-n5hdf\") pod \"certified-operators-jkm8t\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.178424 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-lf7tj_452e4743-083e-420b-9dfc-ea81e1376373/ovs-vswitchd/0.log" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.213979 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lf7tj_452e4743-083e-420b-9dfc-ea81e1376373/ovsdb-server-init/0.log" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.273915 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lf7tj_452e4743-083e-420b-9dfc-ea81e1376373/ovsdb-server/0.log" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.347990 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.699536 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-47j56_483f48a9-eb90-4a5b-aaa6-63e130859f16/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.837349 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jkm8t"] Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.924132 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e0006e64-1f5a-4050-83f6-36ce34d68bf7/openstack-network-exporter/0.log" Oct 02 07:34:17 crc kubenswrapper[4786]: I1002 07:34:17.967317 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e0006e64-1f5a-4050-83f6-36ce34d68bf7/ovn-northd/0.log" Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.142510 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9/ovsdbserver-nb/0.log" Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.153832 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9/openstack-network-exporter/0.log" Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.316252 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9edb2cb3-f47c-4f56-8181-1ec8f9d774f6/openstack-network-exporter/0.log" Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.367453 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9edb2cb3-f47c-4f56-8181-1ec8f9d774f6/ovsdbserver-sb/0.log" Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.461211 4786 generic.go:334] "Generic (PLEG): container finished" podID="7aa8afa9-2afe-4918-99a0-367734537c98" containerID="2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b" exitCode=0 Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.461249 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkm8t" event={"ID":"7aa8afa9-2afe-4918-99a0-367734537c98","Type":"ContainerDied","Data":"2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b"} Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.461273 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkm8t" event={"ID":"7aa8afa9-2afe-4918-99a0-367734537c98","Type":"ContainerStarted","Data":"151f5db54e68bdae607e11ecab79752317fbdbfbf943eec2606a3bb24238f6c3"} Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.541702 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8c555dcd8-dbpk5_07a06aef-d1a6-4093-ac4e-d6aa3ded6b60/placement-api/0.log" Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.611281 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8c555dcd8-dbpk5_07a06aef-d1a6-4093-ac4e-d6aa3ded6b60/placement-log/0.log" Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.752823 4786 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00373ed3-3b08-4040-9e05-fd042f541af6/setup-container/0.log" Oct 02 07:34:18 crc kubenswrapper[4786]: I1002 07:34:18.980330 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00373ed3-3b08-4040-9e05-fd042f541af6/setup-container/0.log" Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.004532 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00373ed3-3b08-4040-9e05-fd042f541af6/rabbitmq/0.log" Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.200305 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_581ccdf3-7a38-4bb7-93ac-035207098fb7/setup-container/0.log" Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.381095 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_581ccdf3-7a38-4bb7-93ac-035207098fb7/rabbitmq/0.log" Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.382148 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_581ccdf3-7a38-4bb7-93ac-035207098fb7/setup-container/0.log" Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.492176 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkm8t" event={"ID":"7aa8afa9-2afe-4918-99a0-367734537c98","Type":"ContainerStarted","Data":"0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c"} Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.567191 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf_1f6f8032-e0d0-460f-bc14-653a74481964/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.713923 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-jfhsk_daf8cf99-f61d-4fc6-bc92-045acafed529/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.783707 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-c879k_5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:19 crc kubenswrapper[4786]: I1002 07:34:19.952162 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8hhjl_3fe8db1d-ac2c-4028-a843-a74d8e787543/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.088177 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ck7z4_d54f4fa9-80d2-47dc-a156-b26cbc9ebd84/ssh-known-hosts-edpm-deployment/0.log" Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.448286 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7dd886698c-blkwg_45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a/proxy-server/0.log" Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.454913 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7dd886698c-blkwg_45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a/proxy-httpd/0.log" Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.504371 4786 generic.go:334] "Generic (PLEG): container finished" podID="7aa8afa9-2afe-4918-99a0-367734537c98" containerID="0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c" exitCode=0 Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.504415 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkm8t" event={"ID":"7aa8afa9-2afe-4918-99a0-367734537c98","Type":"ContainerDied","Data":"0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c"} Oct 02 
07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.625946 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mjfnf_009aff59-5299-4ae9-a997-99c88f103a92/swift-ring-rebalance/0.log" Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.764072 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/account-auditor/0.log" Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.794915 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/account-reaper/0.log" Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.855092 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/account-replicator/0.log" Oct 02 07:34:20 crc kubenswrapper[4786]: I1002 07:34:20.987342 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/container-auditor/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.001468 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/account-server/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.048373 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/container-replicator/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.144851 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/container-server/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.202116 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/container-updater/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.233951 
4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-auditor/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.323745 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-expirer/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.398489 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-replicator/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.400103 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-server/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.546370 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-updater/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.574213 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/swift-recon-cron/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.589879 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/rsync/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.819662 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm_b470bf8f-a980-45f7-b488-eda5005abf20/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.913887 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9/tempest-tests-tempest-tests-runner/0.log" Oct 02 07:34:21 crc kubenswrapper[4786]: I1002 07:34:21.979648 
4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d/test-operator-logs-container/0.log" Oct 02 07:34:22 crc kubenswrapper[4786]: I1002 07:34:22.153963 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8_18524406-101e-474c-853f-5674430d613f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:34:24 crc kubenswrapper[4786]: I1002 07:34:24.558871 4786 generic.go:334] "Generic (PLEG): container finished" podID="2240fee8-3d4b-45e9-b7a1-242d5102f56e" containerID="18bd09b63947d5e80d1f812afd447e84f5140b3531e4f16c3dfc075ac6aab67d" exitCode=0 Oct 02 07:34:24 crc kubenswrapper[4786]: I1002 07:34:24.559441 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj274" event={"ID":"2240fee8-3d4b-45e9-b7a1-242d5102f56e","Type":"ContainerDied","Data":"18bd09b63947d5e80d1f812afd447e84f5140b3531e4f16c3dfc075ac6aab67d"} Oct 02 07:34:25 crc kubenswrapper[4786]: I1002 07:34:25.571358 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lj274" event={"ID":"2240fee8-3d4b-45e9-b7a1-242d5102f56e","Type":"ContainerStarted","Data":"10b0042de8ed8e28adf251518c3eea2a8d07db6adbdeb666f1a9fd314458cad2"} Oct 02 07:34:25 crc kubenswrapper[4786]: I1002 07:34:25.576823 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkm8t" event={"ID":"7aa8afa9-2afe-4918-99a0-367734537c98","Type":"ContainerStarted","Data":"b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47"} Oct 02 07:34:25 crc kubenswrapper[4786]: I1002 07:34:25.591533 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lj274" podStartSLOduration=1.945200464 podStartE2EDuration="11.591519098s" 
podCreationTimestamp="2025-10-02 07:34:14 +0000 UTC" firstStartedPulling="2025-10-02 07:34:15.427661359 +0000 UTC m=+2865.548844490" lastFinishedPulling="2025-10-02 07:34:25.073979993 +0000 UTC m=+2875.195163124" observedRunningTime="2025-10-02 07:34:25.590810531 +0000 UTC m=+2875.711993672" watchObservedRunningTime="2025-10-02 07:34:25.591519098 +0000 UTC m=+2875.712702229" Oct 02 07:34:27 crc kubenswrapper[4786]: I1002 07:34:27.348640 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:27 crc kubenswrapper[4786]: I1002 07:34:27.349079 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:27 crc kubenswrapper[4786]: I1002 07:34:27.496850 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:34:27 crc kubenswrapper[4786]: I1002 07:34:27.496900 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:34:27 crc kubenswrapper[4786]: I1002 07:34:27.626754 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_26e71aaf-fc49-4218-b001-433de642f9ae/memcached/0.log" Oct 02 07:34:28 crc kubenswrapper[4786]: I1002 07:34:28.387840 4786 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-jkm8t" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="registry-server" probeResult="failure" output=< Oct 02 
07:34:28 crc kubenswrapper[4786]: timeout: failed to connect service ":50051" within 1s Oct 02 07:34:28 crc kubenswrapper[4786]: > Oct 02 07:34:34 crc kubenswrapper[4786]: I1002 07:34:34.540752 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:34 crc kubenswrapper[4786]: I1002 07:34:34.541154 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:34 crc kubenswrapper[4786]: I1002 07:34:34.576012 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:34 crc kubenswrapper[4786]: I1002 07:34:34.591659 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jkm8t" podStartSLOduration=12.250186491000001 podStartE2EDuration="18.591647648s" podCreationTimestamp="2025-10-02 07:34:16 +0000 UTC" firstStartedPulling="2025-10-02 07:34:18.463827581 +0000 UTC m=+2868.585010713" lastFinishedPulling="2025-10-02 07:34:24.805288738 +0000 UTC m=+2874.926471870" observedRunningTime="2025-10-02 07:34:25.610857321 +0000 UTC m=+2875.732040462" watchObservedRunningTime="2025-10-02 07:34:34.591647648 +0000 UTC m=+2884.712830779" Oct 02 07:34:34 crc kubenswrapper[4786]: I1002 07:34:34.709603 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lj274" Oct 02 07:34:34 crc kubenswrapper[4786]: I1002 07:34:34.775323 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lj274"] Oct 02 07:34:34 crc kubenswrapper[4786]: I1002 07:34:34.805024 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk5wh"] Oct 02 07:34:34 crc kubenswrapper[4786]: I1002 07:34:34.805230 4786 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/community-operators-kk5wh" podUID="a3a03859-c965-4381-ae79-03adc9b0e700" containerName="registry-server" containerID="cri-o://88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103" gracePeriod=2 Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.197339 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kk5wh" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.279329 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5bg\" (UniqueName: \"kubernetes.io/projected/a3a03859-c965-4381-ae79-03adc9b0e700-kube-api-access-pw5bg\") pod \"a3a03859-c965-4381-ae79-03adc9b0e700\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.279461 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-catalog-content\") pod \"a3a03859-c965-4381-ae79-03adc9b0e700\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.279512 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-utilities\") pod \"a3a03859-c965-4381-ae79-03adc9b0e700\" (UID: \"a3a03859-c965-4381-ae79-03adc9b0e700\") " Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.280107 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-utilities" (OuterVolumeSpecName: "utilities") pod "a3a03859-c965-4381-ae79-03adc9b0e700" (UID: "a3a03859-c965-4381-ae79-03adc9b0e700"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.280215 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.284949 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a03859-c965-4381-ae79-03adc9b0e700-kube-api-access-pw5bg" (OuterVolumeSpecName: "kube-api-access-pw5bg") pod "a3a03859-c965-4381-ae79-03adc9b0e700" (UID: "a3a03859-c965-4381-ae79-03adc9b0e700"). InnerVolumeSpecName "kube-api-access-pw5bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.317278 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3a03859-c965-4381-ae79-03adc9b0e700" (UID: "a3a03859-c965-4381-ae79-03adc9b0e700"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.382445 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3a03859-c965-4381-ae79-03adc9b0e700-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.382478 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw5bg\" (UniqueName: \"kubernetes.io/projected/a3a03859-c965-4381-ae79-03adc9b0e700-kube-api-access-pw5bg\") on node \"crc\" DevicePath \"\"" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.682131 4786 generic.go:334] "Generic (PLEG): container finished" podID="a3a03859-c965-4381-ae79-03adc9b0e700" containerID="88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103" exitCode=0 Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.682168 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5wh" event={"ID":"a3a03859-c965-4381-ae79-03adc9b0e700","Type":"ContainerDied","Data":"88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103"} Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.682194 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kk5wh" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.682213 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kk5wh" event={"ID":"a3a03859-c965-4381-ae79-03adc9b0e700","Type":"ContainerDied","Data":"7e3aa8e97e20caaf9e97a5950d32f2307f990418df6b3b64b9401c378fb2807c"} Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.682232 4786 scope.go:117] "RemoveContainer" containerID="88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.707531 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kk5wh"] Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.707655 4786 scope.go:117] "RemoveContainer" containerID="6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.714189 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kk5wh"] Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.735720 4786 scope.go:117] "RemoveContainer" containerID="a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.757753 4786 scope.go:117] "RemoveContainer" containerID="88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103" Oct 02 07:34:35 crc kubenswrapper[4786]: E1002 07:34:35.758125 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103\": container with ID starting with 88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103 not found: ID does not exist" containerID="88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.758157 4786 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103"} err="failed to get container status \"88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103\": rpc error: code = NotFound desc = could not find container \"88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103\": container with ID starting with 88d6ffbd2c5f74504a540963fc4415571b05bd51298705624b91104eac50b103 not found: ID does not exist" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.758180 4786 scope.go:117] "RemoveContainer" containerID="6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac" Oct 02 07:34:35 crc kubenswrapper[4786]: E1002 07:34:35.758405 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac\": container with ID starting with 6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac not found: ID does not exist" containerID="6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.758430 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac"} err="failed to get container status \"6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac\": rpc error: code = NotFound desc = could not find container \"6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac\": container with ID starting with 6b70f30283ecd7aa8b55bf7f099c72712015dd9e7715b9db90b5bbe0cf8f21ac not found: ID does not exist" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.758446 4786 scope.go:117] "RemoveContainer" containerID="a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372" Oct 02 07:34:35 crc kubenswrapper[4786]: E1002 
07:34:35.758632 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372\": container with ID starting with a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372 not found: ID does not exist" containerID="a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372" Oct 02 07:34:35 crc kubenswrapper[4786]: I1002 07:34:35.758654 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372"} err="failed to get container status \"a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372\": rpc error: code = NotFound desc = could not find container \"a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372\": container with ID starting with a53ccc91754ffcafc97bbb5ace2013d89362e1fe4537d0f5f2f08e36943d6372 not found: ID does not exist" Oct 02 07:34:36 crc kubenswrapper[4786]: I1002 07:34:36.189836 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a03859-c965-4381-ae79-03adc9b0e700" path="/var/lib/kubelet/pods/a3a03859-c965-4381-ae79-03adc9b0e700/volumes" Oct 02 07:34:37 crc kubenswrapper[4786]: I1002 07:34:37.386649 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:37 crc kubenswrapper[4786]: I1002 07:34:37.424222 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.005292 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jkm8t"] Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.005801 4786 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-jkm8t" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="registry-server" containerID="cri-o://b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47" gracePeriod=2 Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.396262 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.555295 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-utilities\") pod \"7aa8afa9-2afe-4918-99a0-367734537c98\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.555357 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-catalog-content\") pod \"7aa8afa9-2afe-4918-99a0-367734537c98\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.555456 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5hdf\" (UniqueName: \"kubernetes.io/projected/7aa8afa9-2afe-4918-99a0-367734537c98-kube-api-access-n5hdf\") pod \"7aa8afa9-2afe-4918-99a0-367734537c98\" (UID: \"7aa8afa9-2afe-4918-99a0-367734537c98\") " Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.556554 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-utilities" (OuterVolumeSpecName: "utilities") pod "7aa8afa9-2afe-4918-99a0-367734537c98" (UID: "7aa8afa9-2afe-4918-99a0-367734537c98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.568755 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa8afa9-2afe-4918-99a0-367734537c98-kube-api-access-n5hdf" (OuterVolumeSpecName: "kube-api-access-n5hdf") pod "7aa8afa9-2afe-4918-99a0-367734537c98" (UID: "7aa8afa9-2afe-4918-99a0-367734537c98"). InnerVolumeSpecName "kube-api-access-n5hdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.592200 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7aa8afa9-2afe-4918-99a0-367734537c98" (UID: "7aa8afa9-2afe-4918-99a0-367734537c98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.657966 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5hdf\" (UniqueName: \"kubernetes.io/projected/7aa8afa9-2afe-4918-99a0-367734537c98-kube-api-access-n5hdf\") on node \"crc\" DevicePath \"\"" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.658232 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.658290 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa8afa9-2afe-4918-99a0-367734537c98-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.725421 4786 generic.go:334] "Generic (PLEG): container finished" podID="7aa8afa9-2afe-4918-99a0-367734537c98" 
containerID="b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47" exitCode=0 Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.725457 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkm8t" event={"ID":"7aa8afa9-2afe-4918-99a0-367734537c98","Type":"ContainerDied","Data":"b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47"} Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.725479 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jkm8t" event={"ID":"7aa8afa9-2afe-4918-99a0-367734537c98","Type":"ContainerDied","Data":"151f5db54e68bdae607e11ecab79752317fbdbfbf943eec2606a3bb24238f6c3"} Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.725460 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jkm8t" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.725496 4786 scope.go:117] "RemoveContainer" containerID="b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.738395 4786 scope.go:117] "RemoveContainer" containerID="0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.751584 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jkm8t"] Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.754730 4786 scope.go:117] "RemoveContainer" containerID="2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.758281 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jkm8t"] Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.790496 4786 scope.go:117] "RemoveContainer" containerID="b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47" Oct 02 
07:34:39 crc kubenswrapper[4786]: E1002 07:34:39.790791 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47\": container with ID starting with b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47 not found: ID does not exist" containerID="b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.790832 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47"} err="failed to get container status \"b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47\": rpc error: code = NotFound desc = could not find container \"b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47\": container with ID starting with b78e9ccff38753ca5b0b3ca784f3733a1a2d1ee4dcadcaa52aff7603640c3c47 not found: ID does not exist" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.790857 4786 scope.go:117] "RemoveContainer" containerID="0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c" Oct 02 07:34:39 crc kubenswrapper[4786]: E1002 07:34:39.791130 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c\": container with ID starting with 0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c not found: ID does not exist" containerID="0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.791159 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c"} err="failed to get container status 
\"0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c\": rpc error: code = NotFound desc = could not find container \"0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c\": container with ID starting with 0dcc9c60bbf211537ad9acbb6d8c03d985921a9d7661358cbbc4afd13d912b8c not found: ID does not exist" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.791179 4786 scope.go:117] "RemoveContainer" containerID="2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b" Oct 02 07:34:39 crc kubenswrapper[4786]: E1002 07:34:39.791429 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b\": container with ID starting with 2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b not found: ID does not exist" containerID="2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b" Oct 02 07:34:39 crc kubenswrapper[4786]: I1002 07:34:39.791449 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b"} err="failed to get container status \"2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b\": rpc error: code = NotFound desc = could not find container \"2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b\": container with ID starting with 2f5786ce6456ff72f50ef7ea365a8a1a81ba095a03e3eca328acaf30db46002b not found: ID does not exist" Oct 02 07:34:40 crc kubenswrapper[4786]: I1002 07:34:40.187159 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" path="/var/lib/kubelet/pods/7aa8afa9-2afe-4918-99a0-367734537c98/volumes" Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.497248 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.497664 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.497931 4786 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.498502 4786 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5"} pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.498572 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" containerID="cri-o://55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" gracePeriod=600 Oct 02 07:34:57 crc kubenswrapper[4786]: E1002 07:34:57.632750 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.866568 4786 generic.go:334] "Generic (PLEG): container finished" podID="79cb22df-4930-4aed-9108-1056074d1000" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" exitCode=0 Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.866605 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerDied","Data":"55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5"} Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.866633 4786 scope.go:117] "RemoveContainer" containerID="834d772f522a6b607bc93d02371b22f2e883bf16a4b9df657cb1c41e0c68d418" Oct 02 07:34:57 crc kubenswrapper[4786]: I1002 07:34:57.867071 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:34:57 crc kubenswrapper[4786]: E1002 07:34:57.867381 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:35:10 crc kubenswrapper[4786]: I1002 07:35:10.953499 4786 generic.go:334] "Generic (PLEG): container finished" podID="67f1c797-1feb-4f08-8211-38d2375023b0" containerID="be346b664ec44ba5400310bd5c21c62d9eb187f0b9397c405dc0153ce3ec446f" exitCode=0 Oct 02 07:35:10 crc kubenswrapper[4786]: I1002 07:35:10.953582 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/crc-debug-qzlhf" 
event={"ID":"67f1c797-1feb-4f08-8211-38d2375023b0","Type":"ContainerDied","Data":"be346b664ec44ba5400310bd5c21c62d9eb187f0b9397c405dc0153ce3ec446f"} Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.032043 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.052001 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7wpts/crc-debug-qzlhf"] Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.056822 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7wpts/crc-debug-qzlhf"] Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.178664 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:35:12 crc kubenswrapper[4786]: E1002 07:35:12.178966 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.224903 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67f1c797-1feb-4f08-8211-38d2375023b0-host\") pod \"67f1c797-1feb-4f08-8211-38d2375023b0\" (UID: \"67f1c797-1feb-4f08-8211-38d2375023b0\") " Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.224946 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67f1c797-1feb-4f08-8211-38d2375023b0-host" (OuterVolumeSpecName: "host") pod "67f1c797-1feb-4f08-8211-38d2375023b0" (UID: 
"67f1c797-1feb-4f08-8211-38d2375023b0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.225023 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kf4l\" (UniqueName: \"kubernetes.io/projected/67f1c797-1feb-4f08-8211-38d2375023b0-kube-api-access-9kf4l\") pod \"67f1c797-1feb-4f08-8211-38d2375023b0\" (UID: \"67f1c797-1feb-4f08-8211-38d2375023b0\") " Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.225478 4786 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67f1c797-1feb-4f08-8211-38d2375023b0-host\") on node \"crc\" DevicePath \"\"" Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.229540 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f1c797-1feb-4f08-8211-38d2375023b0-kube-api-access-9kf4l" (OuterVolumeSpecName: "kube-api-access-9kf4l") pod "67f1c797-1feb-4f08-8211-38d2375023b0" (UID: "67f1c797-1feb-4f08-8211-38d2375023b0"). InnerVolumeSpecName "kube-api-access-9kf4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.326559 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kf4l\" (UniqueName: \"kubernetes.io/projected/67f1c797-1feb-4f08-8211-38d2375023b0-kube-api-access-9kf4l\") on node \"crc\" DevicePath \"\"" Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.969039 4786 scope.go:117] "RemoveContainer" containerID="be346b664ec44ba5400310bd5c21c62d9eb187f0b9397c405dc0153ce3ec446f" Oct 02 07:35:12 crc kubenswrapper[4786]: I1002 07:35:12.969077 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-qzlhf" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.164738 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7wpts/crc-debug-hrdrq"] Oct 02 07:35:13 crc kubenswrapper[4786]: E1002 07:35:13.165034 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="extract-utilities" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165057 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="extract-utilities" Oct 02 07:35:13 crc kubenswrapper[4786]: E1002 07:35:13.165080 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="registry-server" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165086 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="registry-server" Oct 02 07:35:13 crc kubenswrapper[4786]: E1002 07:35:13.165094 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a03859-c965-4381-ae79-03adc9b0e700" containerName="extract-utilities" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165101 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a03859-c965-4381-ae79-03adc9b0e700" containerName="extract-utilities" Oct 02 07:35:13 crc kubenswrapper[4786]: E1002 07:35:13.165108 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="extract-content" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165112 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="extract-content" Oct 02 07:35:13 crc kubenswrapper[4786]: E1002 07:35:13.165125 4786 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a3a03859-c965-4381-ae79-03adc9b0e700" containerName="registry-server" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165129 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a03859-c965-4381-ae79-03adc9b0e700" containerName="registry-server" Oct 02 07:35:13 crc kubenswrapper[4786]: E1002 07:35:13.165142 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f1c797-1feb-4f08-8211-38d2375023b0" containerName="container-00" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165147 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f1c797-1feb-4f08-8211-38d2375023b0" containerName="container-00" Oct 02 07:35:13 crc kubenswrapper[4786]: E1002 07:35:13.165161 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a03859-c965-4381-ae79-03adc9b0e700" containerName="extract-content" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165166 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a03859-c965-4381-ae79-03adc9b0e700" containerName="extract-content" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165321 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f1c797-1feb-4f08-8211-38d2375023b0" containerName="container-00" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165332 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa8afa9-2afe-4918-99a0-367734537c98" containerName="registry-server" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165350 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a03859-c965-4381-ae79-03adc9b0e700" containerName="registry-server" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.165894 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.343140 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4926cf-7279-42e4-89e8-29c58c0bba5e-host\") pod \"crc-debug-hrdrq\" (UID: \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\") " pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.343178 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbcw8\" (UniqueName: \"kubernetes.io/projected/ad4926cf-7279-42e4-89e8-29c58c0bba5e-kube-api-access-nbcw8\") pod \"crc-debug-hrdrq\" (UID: \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\") " pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.444419 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4926cf-7279-42e4-89e8-29c58c0bba5e-host\") pod \"crc-debug-hrdrq\" (UID: \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\") " pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.444468 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbcw8\" (UniqueName: \"kubernetes.io/projected/ad4926cf-7279-42e4-89e8-29c58c0bba5e-kube-api-access-nbcw8\") pod \"crc-debug-hrdrq\" (UID: \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\") " pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.444574 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4926cf-7279-42e4-89e8-29c58c0bba5e-host\") pod \"crc-debug-hrdrq\" (UID: \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\") " pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:13 crc 
kubenswrapper[4786]: I1002 07:35:13.461815 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbcw8\" (UniqueName: \"kubernetes.io/projected/ad4926cf-7279-42e4-89e8-29c58c0bba5e-kube-api-access-nbcw8\") pod \"crc-debug-hrdrq\" (UID: \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\") " pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.479489 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:13 crc kubenswrapper[4786]: W1002 07:35:13.500364 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad4926cf_7279_42e4_89e8_29c58c0bba5e.slice/crio-f298d568b8523e2f3557b0cb0026bc4c8869608a40aa2f69c783a24a2a9de873 WatchSource:0}: Error finding container f298d568b8523e2f3557b0cb0026bc4c8869608a40aa2f69c783a24a2a9de873: Status 404 returned error can't find the container with id f298d568b8523e2f3557b0cb0026bc4c8869608a40aa2f69c783a24a2a9de873 Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.978232 4786 generic.go:334] "Generic (PLEG): container finished" podID="ad4926cf-7279-42e4-89e8-29c58c0bba5e" containerID="4d1074959cdbca1af8404b617dc346ba1e9b59eb4653bb50fc696c3c3e1355ed" exitCode=0 Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.978307 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/crc-debug-hrdrq" event={"ID":"ad4926cf-7279-42e4-89e8-29c58c0bba5e","Type":"ContainerDied","Data":"4d1074959cdbca1af8404b617dc346ba1e9b59eb4653bb50fc696c3c3e1355ed"} Oct 02 07:35:13 crc kubenswrapper[4786]: I1002 07:35:13.978441 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/crc-debug-hrdrq" event={"ID":"ad4926cf-7279-42e4-89e8-29c58c0bba5e","Type":"ContainerStarted","Data":"f298d568b8523e2f3557b0cb0026bc4c8869608a40aa2f69c783a24a2a9de873"} Oct 02 
07:35:14 crc kubenswrapper[4786]: I1002 07:35:14.190462 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67f1c797-1feb-4f08-8211-38d2375023b0" path="/var/lib/kubelet/pods/67f1c797-1feb-4f08-8211-38d2375023b0/volumes" Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.063329 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.167039 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbcw8\" (UniqueName: \"kubernetes.io/projected/ad4926cf-7279-42e4-89e8-29c58c0bba5e-kube-api-access-nbcw8\") pod \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\" (UID: \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\") " Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.167268 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4926cf-7279-42e4-89e8-29c58c0bba5e-host\") pod \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\" (UID: \"ad4926cf-7279-42e4-89e8-29c58c0bba5e\") " Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.167496 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad4926cf-7279-42e4-89e8-29c58c0bba5e-host" (OuterVolumeSpecName: "host") pod "ad4926cf-7279-42e4-89e8-29c58c0bba5e" (UID: "ad4926cf-7279-42e4-89e8-29c58c0bba5e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.168098 4786 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4926cf-7279-42e4-89e8-29c58c0bba5e-host\") on node \"crc\" DevicePath \"\"" Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.171462 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4926cf-7279-42e4-89e8-29c58c0bba5e-kube-api-access-nbcw8" (OuterVolumeSpecName: "kube-api-access-nbcw8") pod "ad4926cf-7279-42e4-89e8-29c58c0bba5e" (UID: "ad4926cf-7279-42e4-89e8-29c58c0bba5e"). InnerVolumeSpecName "kube-api-access-nbcw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.269295 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbcw8\" (UniqueName: \"kubernetes.io/projected/ad4926cf-7279-42e4-89e8-29c58c0bba5e-kube-api-access-nbcw8\") on node \"crc\" DevicePath \"\"" Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.995069 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/crc-debug-hrdrq" event={"ID":"ad4926cf-7279-42e4-89e8-29c58c0bba5e","Type":"ContainerDied","Data":"f298d568b8523e2f3557b0cb0026bc4c8869608a40aa2f69c783a24a2a9de873"} Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.995276 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f298d568b8523e2f3557b0cb0026bc4c8869608a40aa2f69c783a24a2a9de873" Oct 02 07:35:15 crc kubenswrapper[4786]: I1002 07:35:15.995100 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-hrdrq" Oct 02 07:35:18 crc kubenswrapper[4786]: I1002 07:35:18.814670 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7wpts/crc-debug-hrdrq"] Oct 02 07:35:18 crc kubenswrapper[4786]: I1002 07:35:18.820391 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7wpts/crc-debug-hrdrq"] Oct 02 07:35:19 crc kubenswrapper[4786]: I1002 07:35:19.921979 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7wpts/crc-debug-vc2wd"] Oct 02 07:35:19 crc kubenswrapper[4786]: E1002 07:35:19.922520 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4926cf-7279-42e4-89e8-29c58c0bba5e" containerName="container-00" Oct 02 07:35:19 crc kubenswrapper[4786]: I1002 07:35:19.922532 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4926cf-7279-42e4-89e8-29c58c0bba5e" containerName="container-00" Oct 02 07:35:19 crc kubenswrapper[4786]: I1002 07:35:19.922715 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4926cf-7279-42e4-89e8-29c58c0bba5e" containerName="container-00" Oct 02 07:35:19 crc kubenswrapper[4786]: I1002 07:35:19.923232 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:19 crc kubenswrapper[4786]: I1002 07:35:19.926803 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txjcz\" (UniqueName: \"kubernetes.io/projected/a14503c4-6df4-42bf-99f8-e8dfefb7680a-kube-api-access-txjcz\") pod \"crc-debug-vc2wd\" (UID: \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\") " pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:19 crc kubenswrapper[4786]: I1002 07:35:19.926846 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a14503c4-6df4-42bf-99f8-e8dfefb7680a-host\") pod \"crc-debug-vc2wd\" (UID: \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\") " pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:20 crc kubenswrapper[4786]: I1002 07:35:20.028351 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txjcz\" (UniqueName: \"kubernetes.io/projected/a14503c4-6df4-42bf-99f8-e8dfefb7680a-kube-api-access-txjcz\") pod \"crc-debug-vc2wd\" (UID: \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\") " pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:20 crc kubenswrapper[4786]: I1002 07:35:20.028395 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a14503c4-6df4-42bf-99f8-e8dfefb7680a-host\") pod \"crc-debug-vc2wd\" (UID: \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\") " pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:20 crc kubenswrapper[4786]: I1002 07:35:20.028571 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a14503c4-6df4-42bf-99f8-e8dfefb7680a-host\") pod \"crc-debug-vc2wd\" (UID: \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\") " pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:20 crc 
kubenswrapper[4786]: I1002 07:35:20.043888 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txjcz\" (UniqueName: \"kubernetes.io/projected/a14503c4-6df4-42bf-99f8-e8dfefb7680a-kube-api-access-txjcz\") pod \"crc-debug-vc2wd\" (UID: \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\") " pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:20 crc kubenswrapper[4786]: I1002 07:35:20.186825 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4926cf-7279-42e4-89e8-29c58c0bba5e" path="/var/lib/kubelet/pods/ad4926cf-7279-42e4-89e8-29c58c0bba5e/volumes" Oct 02 07:35:20 crc kubenswrapper[4786]: I1002 07:35:20.237840 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:21 crc kubenswrapper[4786]: I1002 07:35:21.029770 4786 generic.go:334] "Generic (PLEG): container finished" podID="a14503c4-6df4-42bf-99f8-e8dfefb7680a" containerID="8e54d3c09de1986fe13c6f7fd8f6c7920fe529157f4b756210005b6804b4b18e" exitCode=0 Oct 02 07:35:21 crc kubenswrapper[4786]: I1002 07:35:21.029872 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/crc-debug-vc2wd" event={"ID":"a14503c4-6df4-42bf-99f8-e8dfefb7680a","Type":"ContainerDied","Data":"8e54d3c09de1986fe13c6f7fd8f6c7920fe529157f4b756210005b6804b4b18e"} Oct 02 07:35:21 crc kubenswrapper[4786]: I1002 07:35:21.030146 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/crc-debug-vc2wd" event={"ID":"a14503c4-6df4-42bf-99f8-e8dfefb7680a","Type":"ContainerStarted","Data":"2598a07a5b739d782d0bad7702c4ec64340a43ae23e340b8f5bd76b2610b433d"} Oct 02 07:35:21 crc kubenswrapper[4786]: I1002 07:35:21.060400 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7wpts/crc-debug-vc2wd"] Oct 02 07:35:21 crc kubenswrapper[4786]: I1002 07:35:21.066614 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-7wpts/crc-debug-vc2wd"] Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.116254 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/util/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.136804 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.162423 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txjcz\" (UniqueName: \"kubernetes.io/projected/a14503c4-6df4-42bf-99f8-e8dfefb7680a-kube-api-access-txjcz\") pod \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\" (UID: \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\") " Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.162461 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a14503c4-6df4-42bf-99f8-e8dfefb7680a-host\") pod \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\" (UID: \"a14503c4-6df4-42bf-99f8-e8dfefb7680a\") " Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.162592 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a14503c4-6df4-42bf-99f8-e8dfefb7680a-host" (OuterVolumeSpecName: "host") pod "a14503c4-6df4-42bf-99f8-e8dfefb7680a" (UID: "a14503c4-6df4-42bf-99f8-e8dfefb7680a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.163021 4786 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a14503c4-6df4-42bf-99f8-e8dfefb7680a-host\") on node \"crc\" DevicePath \"\"" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.167039 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14503c4-6df4-42bf-99f8-e8dfefb7680a-kube-api-access-txjcz" (OuterVolumeSpecName: "kube-api-access-txjcz") pod "a14503c4-6df4-42bf-99f8-e8dfefb7680a" (UID: "a14503c4-6df4-42bf-99f8-e8dfefb7680a"). InnerVolumeSpecName "kube-api-access-txjcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.188208 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14503c4-6df4-42bf-99f8-e8dfefb7680a" path="/var/lib/kubelet/pods/a14503c4-6df4-42bf-99f8-e8dfefb7680a/volumes" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.236275 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/util/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.237058 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/pull/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.262097 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/pull/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.264781 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txjcz\" (UniqueName: 
\"kubernetes.io/projected/a14503c4-6df4-42bf-99f8-e8dfefb7680a-kube-api-access-txjcz\") on node \"crc\" DevicePath \"\"" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.387904 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/util/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.409757 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/pull/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.429759 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/extract/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.515020 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-dkqvh_0872cd5d-1fde-4b27-bdb7-6eade27cee9d/kube-rbac-proxy/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.573295 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-dkqvh_0872cd5d-1fde-4b27-bdb7-6eade27cee9d/manager/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.591022 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-5b4mb_90cbbf62-df45-490f-a76d-6b24fdfe6aa7/kube-rbac-proxy/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.695331 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-5b4mb_90cbbf62-df45-490f-a76d-6b24fdfe6aa7/manager/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.705806 4786 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-4rqms_e229888c-d55e-4b2f-a53a-e588868f98e2/kube-rbac-proxy/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.732957 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-4rqms_e229888c-d55e-4b2f-a53a-e588868f98e2/manager/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.836254 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-w6s6f_40d01a1d-0613-43eb-824b-24b22a879822/kube-rbac-proxy/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.928436 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-w6s6f_40d01a1d-0613-43eb-824b-24b22a879822/manager/0.log" Oct 02 07:35:22 crc kubenswrapper[4786]: I1002 07:35:22.972768 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-xqp2m_458e49e3-37fa-4fde-b98e-35f6490ad3bc/manager/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:22.999971 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-xqp2m_458e49e3-37fa-4fde-b98e-35f6490ad3bc/kube-rbac-proxy/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.063908 4786 scope.go:117] "RemoveContainer" containerID="8e54d3c09de1986fe13c6f7fd8f6c7920fe529157f4b756210005b6804b4b18e" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.063945 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7wpts/crc-debug-vc2wd" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.075398 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-xcd8d_9c27556b-9c6e-4a2d-9c2d-78f471392a85/kube-rbac-proxy/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.111294 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-xcd8d_9c27556b-9c6e-4a2d-9c2d-78f471392a85/manager/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.211965 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5c8fdc4d5c-5vzvp_903d28be-eebf-4dd0-bd15-3f3e2a9416bf/kube-rbac-proxy/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.327096 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5c8fdc4d5c-5vzvp_903d28be-eebf-4dd0-bd15-3f3e2a9416bf/manager/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.329336 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f45cd594f-m99xq_8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9/kube-rbac-proxy/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.379461 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f45cd594f-m99xq_8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9/manager/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.478292 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-jf2qh_bafce6df-ad32-4af9-9e38-d6da16305ee9/kube-rbac-proxy/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.488750 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-jf2qh_bafce6df-ad32-4af9-9e38-d6da16305ee9/manager/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.586568 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-x6hzs_e49d1086-42df-449b-97b8-787edd49ba23/kube-rbac-proxy/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.633398 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-x6hzs_e49d1086-42df-449b-97b8-787edd49ba23/manager/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.657510 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-7xngz_3d6bf83e-81f7-4c56-994f-1058c8ddbe74/kube-rbac-proxy/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.759782 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54fbbfcd44-jpgjv_12f9e929-b1eb-4dd9-a686-0154f89b5dfc/kube-rbac-proxy/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.770174 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-7xngz_3d6bf83e-81f7-4c56-994f-1058c8ddbe74/manager/0.log" Oct 02 07:35:23 crc kubenswrapper[4786]: I1002 07:35:23.961390 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54fbbfcd44-jpgjv_12f9e929-b1eb-4dd9-a686-0154f89b5dfc/manager/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.039512 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7fd5b6bbc6-8cfbh_71792270-25e9-4028-b306-c235e6378802/kube-rbac-proxy/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.151065 
4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7fd5b6bbc6-8cfbh_71792270-25e9-4028-b306-c235e6378802/manager/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.179883 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:35:24 crc kubenswrapper[4786]: E1002 07:35:24.180081 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.182667 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-75f8d67d86-xv465_f496c1c2-326e-4429-b192-07a5ca33b28d/kube-rbac-proxy/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.228913 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-75f8d67d86-xv465_f496c1c2-326e-4429-b192-07a5ca33b28d/manager/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.292323 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-787874f5b7tcwtm_56dd0e8b-9081-4dbc-8683-a0da0f1be122/kube-rbac-proxy/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.329296 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-787874f5b7tcwtm_56dd0e8b-9081-4dbc-8683-a0da0f1be122/manager/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.416614 4786 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67698bcd47-cjhhk_8830c499-339f-4252-8f35-13671a0b6687/kube-rbac-proxy/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.547906 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-859455d779-ksrkf_f1ba6077-2c56-40f9-bcbc-64bae09186c6/kube-rbac-proxy/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.702030 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-859455d779-ksrkf_f1ba6077-2c56-40f9-bcbc-64bae09186c6/operator/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.752275 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-l4f42_d3794fc6-7388-43e3-bf57-545f152e19c4/registry-server/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.908777 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-p85rq_7aaa7a8a-2599-4f27-9bce-1fcee450fbfa/kube-rbac-proxy/0.log" Oct 02 07:35:24 crc kubenswrapper[4786]: I1002 07:35:24.986314 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-p85rq_7aaa7a8a-2599-4f27-9bce-1fcee450fbfa/manager/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.086297 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-r2qcj_ed52f44e-1a43-4d07-a613-0d5d7c367a51/kube-rbac-proxy/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.159424 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-r2qcj_ed52f44e-1a43-4d07-a613-0d5d7c367a51/manager/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 
07:35:25.290368 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5_4910bc83-927a-41bb-a4a6-834c09bcbc6e/operator/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.342587 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-689b4f76c9-gslcx_8e441bad-8657-4e25-a66b-caf4fc28ae8a/kube-rbac-proxy/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.353098 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67698bcd47-cjhhk_8830c499-339f-4252-8f35-13671a0b6687/manager/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.373206 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-689b4f76c9-gslcx_8e441bad-8657-4e25-a66b-caf4fc28ae8a/manager/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.438424 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-89t2n_ae767b5f-e700-4253-be42-48f90b9fe99e/kube-rbac-proxy/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.529595 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-89t2n_ae767b5f-e700-4253-be42-48f90b9fe99e/manager/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.533226 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-cbdf6dc66-phjsn_fcbaee3e-a527-4eb0-9c2b-7ada804e7920/kube-rbac-proxy/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.585966 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-cbdf6dc66-phjsn_fcbaee3e-a527-4eb0-9c2b-7ada804e7920/manager/0.log" Oct 02 07:35:25 
crc kubenswrapper[4786]: I1002 07:35:25.667357 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-68d7bc5569-6n59c_810706c5-80b3-47b8-8058-2b0aa1665942/kube-rbac-proxy/0.log" Oct 02 07:35:25 crc kubenswrapper[4786]: I1002 07:35:25.674365 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-68d7bc5569-6n59c_810706c5-80b3-47b8-8058-2b0aa1665942/manager/0.log" Oct 02 07:35:35 crc kubenswrapper[4786]: I1002 07:35:35.475732 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xw7f8_24e3309e-db58-4fa4-a4f6-08fdb1ddb95c/control-plane-machine-set-operator/0.log" Oct 02 07:35:35 crc kubenswrapper[4786]: I1002 07:35:35.611394 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-97p7p_e0050175-139a-4210-add6-1b7bbe800f27/kube-rbac-proxy/0.log" Oct 02 07:35:35 crc kubenswrapper[4786]: I1002 07:35:35.620144 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-97p7p_e0050175-139a-4210-add6-1b7bbe800f27/machine-api-operator/0.log" Oct 02 07:35:36 crc kubenswrapper[4786]: I1002 07:35:36.182371 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:35:36 crc kubenswrapper[4786]: E1002 07:35:36.182569 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:35:43 crc kubenswrapper[4786]: I1002 
07:35:43.250639 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-c7bxl_27ce8ca7-3393-43c6-ac0e-6e4128f84527/cert-manager-controller/0.log" Oct 02 07:35:43 crc kubenswrapper[4786]: I1002 07:35:43.372530 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-8tlx5_5286d0c8-0ce3-4d66-882f-9ffea6c90fa4/cert-manager-cainjector/0.log" Oct 02 07:35:43 crc kubenswrapper[4786]: I1002 07:35:43.405000 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9wd5r_ca1f4166-adf4-4281-925f-224930e8f775/cert-manager-webhook/0.log" Oct 02 07:35:48 crc kubenswrapper[4786]: I1002 07:35:48.179657 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:35:48 crc kubenswrapper[4786]: E1002 07:35:48.180239 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:35:50 crc kubenswrapper[4786]: I1002 07:35:50.971684 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-z8sr6_fe88424e-e46c-46b3-a2e0-7bba5ef147b3/nmstate-console-plugin/0.log" Oct 02 07:35:51 crc kubenswrapper[4786]: I1002 07:35:51.097251 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kfbdc_9b208785-3928-4bc7-a6fd-1bcee5029917/nmstate-handler/0.log" Oct 02 07:35:51 crc kubenswrapper[4786]: I1002 07:35:51.144580 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-kgjxr_84995ef7-937a-442e-a016-f22a24d82882/kube-rbac-proxy/0.log" Oct 02 07:35:51 crc kubenswrapper[4786]: I1002 07:35:51.165055 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-kgjxr_84995ef7-937a-442e-a016-f22a24d82882/nmstate-metrics/0.log" Oct 02 07:35:51 crc kubenswrapper[4786]: I1002 07:35:51.291556 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-pn52f_75a4981d-3614-4013-80e8-dcc8cd60da94/nmstate-operator/0.log" Oct 02 07:35:51 crc kubenswrapper[4786]: I1002 07:35:51.337412 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-6szkj_9376f046-2ecd-4c20-bf82-4f18490d91d9/nmstate-webhook/0.log" Oct 02 07:35:59 crc kubenswrapper[4786]: I1002 07:35:59.970912 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-4pdp4_defc1b9a-20a3-4272-af46-ad01ef957dba/kube-rbac-proxy/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.108457 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-4pdp4_defc1b9a-20a3-4272-af46-ad01ef957dba/controller/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.141702 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-frr-files/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.274665 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-metrics/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.279223 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-reloader/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.290280 
4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-frr-files/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.295764 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-reloader/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.414069 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-reloader/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.435042 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-metrics/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.435042 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-metrics/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.435140 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-frr-files/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.561717 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-frr-files/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.569698 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-reloader/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.581287 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-metrics/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.587975 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/controller/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.732477 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/kube-rbac-proxy-frr/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.734262 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/kube-rbac-proxy/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.736888 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/frr-metrics/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.919753 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-qspcx_bfefe2c3-8e13-45a3-b700-cda75a37345c/frr-k8s-webhook-server/0.log" Oct 02 07:36:00 crc kubenswrapper[4786]: I1002 07:36:00.940282 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/reloader/0.log" Oct 02 07:36:01 crc kubenswrapper[4786]: I1002 07:36:01.144740 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85846c6c54-9wx2x_42b1054f-e5f7-4d21-a4a8-98bcb85946c5/manager/0.log" Oct 02 07:36:01 crc kubenswrapper[4786]: I1002 07:36:01.285899 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6765d85898-4g7pf_b29b896b-61fb-40e7-80ce-d87fc031e3ae/webhook-server/0.log" Oct 02 07:36:01 crc kubenswrapper[4786]: I1002 07:36:01.310389 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6drsl_3b91fb80-e839-4d96-aa9a-4e08642aafe1/kube-rbac-proxy/0.log" Oct 02 07:36:01 crc kubenswrapper[4786]: I1002 07:36:01.760033 4786 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6drsl_3b91fb80-e839-4d96-aa9a-4e08642aafe1/speaker/0.log" Oct 02 07:36:01 crc kubenswrapper[4786]: I1002 07:36:01.776344 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/frr/0.log" Oct 02 07:36:02 crc kubenswrapper[4786]: I1002 07:36:02.179747 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:36:02 crc kubenswrapper[4786]: E1002 07:36:02.180087 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.076212 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/util/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.212002 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/pull/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.227477 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/util/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.248734 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/pull/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.391354 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/pull/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.394886 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/util/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.407081 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/extract/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.518868 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-utilities/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.623889 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-utilities/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.654728 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-content/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.661109 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-content/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.758840 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-content/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.760237 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-utilities/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.949361 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-utilities/0.log" Oct 02 07:36:09 crc kubenswrapper[4786]: I1002 07:36:09.958496 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/registry-server/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.032033 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-utilities/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.040896 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-content/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.065983 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-content/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.188431 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-content/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.211658 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-utilities/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.300300 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/registry-server/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.338502 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/util/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.466753 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/util/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.469650 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/pull/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.471538 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/pull/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.585102 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/pull/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.599138 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/util/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.614104 
4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/extract/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.748816 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tq2t2_53de3bf0-46ae-4969-a69e-2ad45e207407/marketplace-operator/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.752615 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-utilities/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.872392 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-utilities/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.887317 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-content/0.log" Oct 02 07:36:10 crc kubenswrapper[4786]: I1002 07:36:10.896049 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-content/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.009801 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-utilities/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.011826 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-content/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.111456 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/registry-server/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.147882 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-utilities/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.296357 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-utilities/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.308868 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-content/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.309528 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-content/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.468861 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-utilities/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.470468 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-content/0.log" Oct 02 07:36:11 crc kubenswrapper[4786]: I1002 07:36:11.799960 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/registry-server/0.log" Oct 02 07:36:13 crc kubenswrapper[4786]: I1002 07:36:13.179382 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:36:13 crc kubenswrapper[4786]: E1002 07:36:13.179588 
4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:36:26 crc kubenswrapper[4786]: I1002 07:36:26.178946 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:36:26 crc kubenswrapper[4786]: E1002 07:36:26.179492 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:36:39 crc kubenswrapper[4786]: I1002 07:36:39.179503 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:36:39 crc kubenswrapper[4786]: E1002 07:36:39.180097 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:36:53 crc kubenswrapper[4786]: I1002 07:36:53.179548 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:36:53 crc kubenswrapper[4786]: E1002 
07:36:53.180060 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:37:06 crc kubenswrapper[4786]: I1002 07:37:06.178849 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:37:06 crc kubenswrapper[4786]: E1002 07:37:06.179382 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:37:18 crc kubenswrapper[4786]: I1002 07:37:18.183502 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:37:18 crc kubenswrapper[4786]: E1002 07:37:18.183993 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:37:31 crc kubenswrapper[4786]: I1002 07:37:31.180734 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:37:31 crc 
kubenswrapper[4786]: E1002 07:37:31.181442 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:37:34 crc kubenswrapper[4786]: I1002 07:37:34.948217 4786 generic.go:334] "Generic (PLEG): container finished" podID="d32bb154-ceb0-4098-b783-4be98ec66836" containerID="59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9" exitCode=0 Oct 02 07:37:34 crc kubenswrapper[4786]: I1002 07:37:34.948321 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7wpts/must-gather-pk9zt" event={"ID":"d32bb154-ceb0-4098-b783-4be98ec66836","Type":"ContainerDied","Data":"59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9"} Oct 02 07:37:34 crc kubenswrapper[4786]: I1002 07:37:34.949464 4786 scope.go:117] "RemoveContainer" containerID="59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9" Oct 02 07:37:35 crc kubenswrapper[4786]: I1002 07:37:35.432255 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7wpts_must-gather-pk9zt_d32bb154-ceb0-4098-b783-4be98ec66836/gather/0.log" Oct 02 07:37:42 crc kubenswrapper[4786]: I1002 07:37:42.590961 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7wpts/must-gather-pk9zt"] Oct 02 07:37:42 crc kubenswrapper[4786]: I1002 07:37:42.591507 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7wpts/must-gather-pk9zt" podUID="d32bb154-ceb0-4098-b783-4be98ec66836" containerName="copy" containerID="cri-o://b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae" gracePeriod=2 Oct 02 07:37:42 crc 
kubenswrapper[4786]: I1002 07:37:42.597901 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7wpts/must-gather-pk9zt"] Oct 02 07:37:42 crc kubenswrapper[4786]: I1002 07:37:42.929263 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7wpts_must-gather-pk9zt_d32bb154-ceb0-4098-b783-4be98ec66836/copy/0.log" Oct 02 07:37:42 crc kubenswrapper[4786]: I1002 07:37:42.929876 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.006981 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7wpts_must-gather-pk9zt_d32bb154-ceb0-4098-b783-4be98ec66836/copy/0.log" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.007272 4786 generic.go:334] "Generic (PLEG): container finished" podID="d32bb154-ceb0-4098-b783-4be98ec66836" containerID="b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae" exitCode=143 Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.007313 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7wpts/must-gather-pk9zt" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.007320 4786 scope.go:117] "RemoveContainer" containerID="b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.021554 4786 scope.go:117] "RemoveContainer" containerID="59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.068248 4786 scope.go:117] "RemoveContainer" containerID="b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae" Oct 02 07:37:43 crc kubenswrapper[4786]: E1002 07:37:43.068620 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae\": container with ID starting with b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae not found: ID does not exist" containerID="b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.068656 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae"} err="failed to get container status \"b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae\": rpc error: code = NotFound desc = could not find container \"b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae\": container with ID starting with b654421e4f34feaf10fedbbfbed93511170fd8e653cf3073af7dd4d5ccd934ae not found: ID does not exist" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.068681 4786 scope.go:117] "RemoveContainer" containerID="59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9" Oct 02 07:37:43 crc kubenswrapper[4786]: E1002 07:37:43.068970 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9\": container with ID starting with 59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9 not found: ID does not exist" containerID="59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.069003 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9"} err="failed to get container status \"59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9\": rpc error: code = NotFound desc = could not find container \"59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9\": container with ID starting with 59c86f013e6393e14b59836c95a35227b5e124326a75372e2d7f573de08efcb9 not found: ID does not exist" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.110324 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkpd5\" (UniqueName: \"kubernetes.io/projected/d32bb154-ceb0-4098-b783-4be98ec66836-kube-api-access-dkpd5\") pod \"d32bb154-ceb0-4098-b783-4be98ec66836\" (UID: \"d32bb154-ceb0-4098-b783-4be98ec66836\") " Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.110445 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d32bb154-ceb0-4098-b783-4be98ec66836-must-gather-output\") pod \"d32bb154-ceb0-4098-b783-4be98ec66836\" (UID: \"d32bb154-ceb0-4098-b783-4be98ec66836\") " Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.114780 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d32bb154-ceb0-4098-b783-4be98ec66836-kube-api-access-dkpd5" (OuterVolumeSpecName: "kube-api-access-dkpd5") pod "d32bb154-ceb0-4098-b783-4be98ec66836" (UID: 
"d32bb154-ceb0-4098-b783-4be98ec66836"). InnerVolumeSpecName "kube-api-access-dkpd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.212303 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkpd5\" (UniqueName: \"kubernetes.io/projected/d32bb154-ceb0-4098-b783-4be98ec66836-kube-api-access-dkpd5\") on node \"crc\" DevicePath \"\"" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.218919 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d32bb154-ceb0-4098-b783-4be98ec66836-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d32bb154-ceb0-4098-b783-4be98ec66836" (UID: "d32bb154-ceb0-4098-b783-4be98ec66836"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:37:43 crc kubenswrapper[4786]: I1002 07:37:43.313755 4786 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d32bb154-ceb0-4098-b783-4be98ec66836-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 07:37:44 crc kubenswrapper[4786]: I1002 07:37:44.180151 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:37:44 crc kubenswrapper[4786]: E1002 07:37:44.180520 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:37:44 crc kubenswrapper[4786]: I1002 07:37:44.187971 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d32bb154-ceb0-4098-b783-4be98ec66836" path="/var/lib/kubelet/pods/d32bb154-ceb0-4098-b783-4be98ec66836/volumes" Oct 02 07:37:58 crc kubenswrapper[4786]: I1002 07:37:58.179331 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:37:58 crc kubenswrapper[4786]: E1002 07:37:58.179996 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:38:10 crc kubenswrapper[4786]: I1002 07:38:10.189429 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:38:10 crc kubenswrapper[4786]: E1002 07:38:10.190187 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.689222 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mp7t7/must-gather-td5lb"] Oct 02 07:38:11 crc kubenswrapper[4786]: E1002 07:38:11.689763 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d32bb154-ceb0-4098-b783-4be98ec66836" containerName="copy" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.689776 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32bb154-ceb0-4098-b783-4be98ec66836" 
containerName="copy" Oct 02 07:38:11 crc kubenswrapper[4786]: E1002 07:38:11.689802 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14503c4-6df4-42bf-99f8-e8dfefb7680a" containerName="container-00" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.689808 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14503c4-6df4-42bf-99f8-e8dfefb7680a" containerName="container-00" Oct 02 07:38:11 crc kubenswrapper[4786]: E1002 07:38:11.689830 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d32bb154-ceb0-4098-b783-4be98ec66836" containerName="gather" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.689837 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32bb154-ceb0-4098-b783-4be98ec66836" containerName="gather" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.690017 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14503c4-6df4-42bf-99f8-e8dfefb7680a" containerName="container-00" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.690028 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d32bb154-ceb0-4098-b783-4be98ec66836" containerName="copy" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.690034 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="d32bb154-ceb0-4098-b783-4be98ec66836" containerName="gather" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.691742 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/must-gather-td5lb" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.695090 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mp7t7"/"kube-root-ca.crt" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.695285 4786 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mp7t7"/"openshift-service-ca.crt" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.716428 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mp7t7/must-gather-td5lb"] Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.758237 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktn7\" (UniqueName: \"kubernetes.io/projected/c7dd6349-7b47-45a9-bfca-e8d3080d3353-kube-api-access-rktn7\") pod \"must-gather-td5lb\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") " pod="openshift-must-gather-mp7t7/must-gather-td5lb" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.758453 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7dd6349-7b47-45a9-bfca-e8d3080d3353-must-gather-output\") pod \"must-gather-td5lb\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") " pod="openshift-must-gather-mp7t7/must-gather-td5lb" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.859961 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7dd6349-7b47-45a9-bfca-e8d3080d3353-must-gather-output\") pod \"must-gather-td5lb\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") " pod="openshift-must-gather-mp7t7/must-gather-td5lb" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.860170 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rktn7\" (UniqueName: \"kubernetes.io/projected/c7dd6349-7b47-45a9-bfca-e8d3080d3353-kube-api-access-rktn7\") pod \"must-gather-td5lb\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") " pod="openshift-must-gather-mp7t7/must-gather-td5lb" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.860418 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7dd6349-7b47-45a9-bfca-e8d3080d3353-must-gather-output\") pod \"must-gather-td5lb\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") " pod="openshift-must-gather-mp7t7/must-gather-td5lb" Oct 02 07:38:11 crc kubenswrapper[4786]: I1002 07:38:11.874104 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktn7\" (UniqueName: \"kubernetes.io/projected/c7dd6349-7b47-45a9-bfca-e8d3080d3353-kube-api-access-rktn7\") pod \"must-gather-td5lb\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") " pod="openshift-must-gather-mp7t7/must-gather-td5lb" Oct 02 07:38:12 crc kubenswrapper[4786]: I1002 07:38:12.006884 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/must-gather-td5lb" Oct 02 07:38:12 crc kubenswrapper[4786]: I1002 07:38:12.374102 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mp7t7/must-gather-td5lb"] Oct 02 07:38:13 crc kubenswrapper[4786]: I1002 07:38:13.192313 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/must-gather-td5lb" event={"ID":"c7dd6349-7b47-45a9-bfca-e8d3080d3353","Type":"ContainerStarted","Data":"c1174d80cfdbae804e93da5c2a4f688ea8778ee89cd2c5b5764be478b6e1bd0b"} Oct 02 07:38:13 crc kubenswrapper[4786]: I1002 07:38:13.192520 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/must-gather-td5lb" event={"ID":"c7dd6349-7b47-45a9-bfca-e8d3080d3353","Type":"ContainerStarted","Data":"67385f0d4d5e92d84c80b095266c0e6893cf4009f30ad55355729b5dfd6f73c5"} Oct 02 07:38:13 crc kubenswrapper[4786]: I1002 07:38:13.192535 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/must-gather-td5lb" event={"ID":"c7dd6349-7b47-45a9-bfca-e8d3080d3353","Type":"ContainerStarted","Data":"bd35286938b1a116df4d36ce7e27ef4779bcd82d0551d1e6261e358a5a39890a"} Oct 02 07:38:13 crc kubenswrapper[4786]: I1002 07:38:13.204198 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mp7t7/must-gather-td5lb" podStartSLOduration=2.204185143 podStartE2EDuration="2.204185143s" podCreationTimestamp="2025-10-02 07:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:38:13.20061993 +0000 UTC m=+3103.321803071" watchObservedRunningTime="2025-10-02 07:38:13.204185143 +0000 UTC m=+3103.325368274" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.056675 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mp7t7/crc-debug-tc27p"] Oct 02 07:38:15 crc kubenswrapper[4786]: 
I1002 07:38:15.057884 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.059727 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mp7t7"/"default-dockercfg-479mm" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.115284 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29031843-b2ec-493b-b21d-82ca3f857864-host\") pod \"crc-debug-tc27p\" (UID: \"29031843-b2ec-493b-b21d-82ca3f857864\") " pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.115497 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5w4\" (UniqueName: \"kubernetes.io/projected/29031843-b2ec-493b-b21d-82ca3f857864-kube-api-access-sj5w4\") pod \"crc-debug-tc27p\" (UID: \"29031843-b2ec-493b-b21d-82ca3f857864\") " pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.216610 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29031843-b2ec-493b-b21d-82ca3f857864-host\") pod \"crc-debug-tc27p\" (UID: \"29031843-b2ec-493b-b21d-82ca3f857864\") " pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.216760 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5w4\" (UniqueName: \"kubernetes.io/projected/29031843-b2ec-493b-b21d-82ca3f857864-kube-api-access-sj5w4\") pod \"crc-debug-tc27p\" (UID: \"29031843-b2ec-493b-b21d-82ca3f857864\") " pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.216779 4786 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29031843-b2ec-493b-b21d-82ca3f857864-host\") pod \"crc-debug-tc27p\" (UID: \"29031843-b2ec-493b-b21d-82ca3f857864\") " pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.234246 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5w4\" (UniqueName: \"kubernetes.io/projected/29031843-b2ec-493b-b21d-82ca3f857864-kube-api-access-sj5w4\") pod \"crc-debug-tc27p\" (UID: \"29031843-b2ec-493b-b21d-82ca3f857864\") " pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:38:15 crc kubenswrapper[4786]: I1002 07:38:15.372987 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:38:15 crc kubenswrapper[4786]: W1002 07:38:15.400552 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29031843_b2ec_493b_b21d_82ca3f857864.slice/crio-eab2430f0e6f39940bdc871868409f6b2e5a895de07fa08e208d6058198ff110 WatchSource:0}: Error finding container eab2430f0e6f39940bdc871868409f6b2e5a895de07fa08e208d6058198ff110: Status 404 returned error can't find the container with id eab2430f0e6f39940bdc871868409f6b2e5a895de07fa08e208d6058198ff110 Oct 02 07:38:16 crc kubenswrapper[4786]: I1002 07:38:16.212004 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/crc-debug-tc27p" event={"ID":"29031843-b2ec-493b-b21d-82ca3f857864","Type":"ContainerStarted","Data":"edd94b8d382fd65ad96082bb815701d548db4ee0f2816b6102118159cea4d954"} Oct 02 07:38:16 crc kubenswrapper[4786]: I1002 07:38:16.213016 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/crc-debug-tc27p" event={"ID":"29031843-b2ec-493b-b21d-82ca3f857864","Type":"ContainerStarted","Data":"eab2430f0e6f39940bdc871868409f6b2e5a895de07fa08e208d6058198ff110"} Oct 
02 07:38:16 crc kubenswrapper[4786]: I1002 07:38:16.227356 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mp7t7/crc-debug-tc27p" podStartSLOduration=1.22734452 podStartE2EDuration="1.22734452s" podCreationTimestamp="2025-10-02 07:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 07:38:16.22256557 +0000 UTC m=+3106.343748700" watchObservedRunningTime="2025-10-02 07:38:16.22734452 +0000 UTC m=+3106.348527651" Oct 02 07:38:25 crc kubenswrapper[4786]: I1002 07:38:25.180023 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:38:25 crc kubenswrapper[4786]: E1002 07:38:25.180747 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:38:37 crc kubenswrapper[4786]: I1002 07:38:37.180141 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:38:37 crc kubenswrapper[4786]: E1002 07:38:37.181070 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:38:51 crc kubenswrapper[4786]: I1002 07:38:51.179618 4786 scope.go:117] 
"RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:38:51 crc kubenswrapper[4786]: E1002 07:38:51.180223 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:38:59 crc kubenswrapper[4786]: I1002 07:38:59.915098 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c4f87b654-hv865_ed22a27c-dae6-448c-b789-85add05aff31/barbican-api/0.log" Oct 02 07:38:59 crc kubenswrapper[4786]: I1002 07:38:59.932572 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c4f87b654-hv865_ed22a27c-dae6-448c-b789-85add05aff31/barbican-api-log/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.046684 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-755cbf448b-g8s9l_812267df-c387-42a6-a6b4-758beccdd77d/barbican-keystone-listener/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.100530 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-755cbf448b-g8s9l_812267df-c387-42a6-a6b4-758beccdd77d/barbican-keystone-listener-log/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.187653 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5fc459f649-l589x_af54426b-d853-4806-bdf8-c1fd22cb6752/barbican-worker/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.236304 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5fc459f649-l589x_af54426b-d853-4806-bdf8-c1fd22cb6752/barbican-worker-log/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.366512 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-tx7kf_2bd6d4c9-950c-4051-a20e-cfae8655dab2/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.483563 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bebe6de-020d-4c9d-b4ea-a3069dace1c8/ceilometer-central-agent/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.497166 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bebe6de-020d-4c9d-b4ea-a3069dace1c8/ceilometer-notification-agent/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.533999 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bebe6de-020d-4c9d-b4ea-a3069dace1c8/proxy-httpd/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.608430 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2bebe6de-020d-4c9d-b4ea-a3069dace1c8/sg-core/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.687364 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_546a4a1f-7f71-4714-8d8c-a012948427bb/cinder-api/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.750808 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_546a4a1f-7f71-4714-8d8c-a012948427bb/cinder-api-log/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.852471 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fda2a685-4713-46a8-9630-6d9d70f80bf7/cinder-scheduler/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.890555 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_fda2a685-4713-46a8-9630-6d9d70f80bf7/probe/0.log" Oct 02 07:39:00 crc kubenswrapper[4786]: I1002 07:39:00.995287 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-z7xjq_2747d6cd-de52-43b7-a2d0-16c86deecd42/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.095986 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bpgrp_169def9f-29e2-41fb-bf34-86464f366256/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.190638 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-rxb4g_d3db3504-6495-465c-8c96-90b80bdcb97e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.339925 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-fdcbb4567-8fktd_1fc331cd-16f8-41c1-8a54-7259f7c5fecb/init/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.429012 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-fdcbb4567-8fktd_1fc331cd-16f8-41c1-8a54-7259f7c5fecb/init/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.461310 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-fdcbb4567-8fktd_1fc331cd-16f8-41c1-8a54-7259f7c5fecb/dnsmasq-dns/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.590531 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-n6zvw_aa4315e2-27f5-4b58-91a6-1d5b683c6e55/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.705242 4786 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_glance-default-external-api-0_cbc5c046-7b05-4041-bb1f-a9851dde1c79/glance-httpd/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.724119 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cbc5c046-7b05-4041-bb1f-a9851dde1c79/glance-log/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.847269 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_326311ac-c390-46ad-bdd7-29ce60f094bc/glance-httpd/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.854100 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_326311ac-c390-46ad-bdd7-29ce60f094bc/glance-log/0.log" Oct 02 07:39:01 crc kubenswrapper[4786]: I1002 07:39:01.963621 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zxnn6_2b54f55b-d53c-449b-a2e3-6ca66ae19657/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:02 crc kubenswrapper[4786]: I1002 07:39:02.100911 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-7r8j5_4267f5c8-feae-4e65-a80a-ebb4c7003eaf/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:02 crc kubenswrapper[4786]: I1002 07:39:02.300949 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-749d95758-7hp9f_1c71cbf2-87c2-4bd2-a6c9-bd641976f8d7/keystone-api/0.log" Oct 02 07:39:02 crc kubenswrapper[4786]: I1002 07:39:02.309505 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e002986c-0138-4b99-98a9-3d2095810fb4/kube-state-metrics/0.log" Oct 02 07:39:02 crc kubenswrapper[4786]: I1002 07:39:02.437531 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-8qtp7_f597e524-6913-4445-ac68-00fb20d044b8/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:02 crc kubenswrapper[4786]: I1002 07:39:02.659901 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c985b58cc-mrm2m_89685228-db75-441e-83ae-74720db4de72/neutron-api/0.log" Oct 02 07:39:02 crc kubenswrapper[4786]: I1002 07:39:02.669529 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c985b58cc-mrm2m_89685228-db75-441e-83ae-74720db4de72/neutron-httpd/0.log" Oct 02 07:39:02 crc kubenswrapper[4786]: I1002 07:39:02.848212 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-z4tjt_5e5e53e7-0f7d-4d7a-b410-364982cf5311/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:03 crc kubenswrapper[4786]: I1002 07:39:03.178986 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:39:03 crc kubenswrapper[4786]: E1002 07:39:03.179520 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:39:03 crc kubenswrapper[4786]: I1002 07:39:03.214842 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5925ec23-28fa-42cb-9f3b-aa96d2efab12/nova-api-log/0.log" Oct 02 07:39:03 crc kubenswrapper[4786]: I1002 07:39:03.427418 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5925ec23-28fa-42cb-9f3b-aa96d2efab12/nova-api-api/0.log" Oct 02 07:39:03 
crc kubenswrapper[4786]: I1002 07:39:03.484707 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_6ca74ccc-e756-4687-ad8e-7ffecd4b92f7/nova-cell0-conductor-conductor/0.log" Oct 02 07:39:03 crc kubenswrapper[4786]: I1002 07:39:03.753264 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9c68bf5e-fcbd-48d6-bcfb-f36d68381c09/nova-cell1-conductor-conductor/0.log" Oct 02 07:39:03 crc kubenswrapper[4786]: I1002 07:39:03.819178 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_841f7177-db9e-4e73-a7b4-ccf30d1538fb/nova-cell1-novncproxy-novncproxy/0.log" Oct 02 07:39:03 crc kubenswrapper[4786]: I1002 07:39:03.987388 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-qnbsm_0378af36-7eb4-4ef7-bf6a-dc2bd678c31e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:04 crc kubenswrapper[4786]: I1002 07:39:04.118415 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15256076-c838-461c-98d9-4ae4883be465/nova-metadata-log/0.log" Oct 02 07:39:04 crc kubenswrapper[4786]: I1002 07:39:04.494795 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_945deef9-0f62-4eeb-af6c-beb702d5458b/nova-scheduler-scheduler/0.log" Oct 02 07:39:04 crc kubenswrapper[4786]: I1002 07:39:04.567481 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4c62a8c-eb69-4f97-a299-a44f87315f81/mysql-bootstrap/0.log" Oct 02 07:39:04 crc kubenswrapper[4786]: I1002 07:39:04.777452 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e4c62a8c-eb69-4f97-a299-a44f87315f81/mysql-bootstrap/0.log" Oct 02 07:39:04 crc kubenswrapper[4786]: I1002 07:39:04.783902 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_e4c62a8c-eb69-4f97-a299-a44f87315f81/galera/0.log" Oct 02 07:39:04 crc kubenswrapper[4786]: I1002 07:39:04.990828 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_106dd908-3398-489e-a39f-b684b5eecd2b/mysql-bootstrap/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.163285 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_106dd908-3398-489e-a39f-b684b5eecd2b/mysql-bootstrap/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.169737 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_15256076-c838-461c-98d9-4ae4883be465/nova-metadata-metadata/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.219487 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_106dd908-3398-489e-a39f-b684b5eecd2b/galera/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.354064 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f57411f4-01bc-4ecc-a240-9101e861f97d/openstackclient/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.554259 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-gl77n_d0b71af7-f7f4-45e8-b8f0-c7428f54a37d/ovn-controller/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.612180 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jz66d_003620a3-92b9-4640-be9e-9a8b064fc888/openstack-network-exporter/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.749163 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lf7tj_452e4743-083e-420b-9dfc-ea81e1376373/ovsdb-server-init/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.916559 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-lf7tj_452e4743-083e-420b-9dfc-ea81e1376373/ovsdb-server-init/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.973768 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lf7tj_452e4743-083e-420b-9dfc-ea81e1376373/ovsdb-server/0.log" Oct 02 07:39:05 crc kubenswrapper[4786]: I1002 07:39:05.978319 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lf7tj_452e4743-083e-420b-9dfc-ea81e1376373/ovs-vswitchd/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.170890 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-47j56_483f48a9-eb90-4a5b-aaa6-63e130859f16/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.255367 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e0006e64-1f5a-4050-83f6-36ce34d68bf7/openstack-network-exporter/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.329665 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e0006e64-1f5a-4050-83f6-36ce34d68bf7/ovn-northd/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.438549 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9/openstack-network-exporter/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.539818 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_11d8bc2f-6c0b-45d9-a1a9-54bc1222d0b9/ovsdbserver-nb/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.630862 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9edb2cb3-f47c-4f56-8181-1ec8f9d774f6/openstack-network-exporter/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.691016 4786 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9edb2cb3-f47c-4f56-8181-1ec8f9d774f6/ovsdbserver-sb/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.893546 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8c555dcd8-dbpk5_07a06aef-d1a6-4093-ac4e-d6aa3ded6b60/placement-api/0.log" Oct 02 07:39:06 crc kubenswrapper[4786]: I1002 07:39:06.921907 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8c555dcd8-dbpk5_07a06aef-d1a6-4093-ac4e-d6aa3ded6b60/placement-log/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.069767 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00373ed3-3b08-4040-9e05-fd042f541af6/setup-container/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.213870 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00373ed3-3b08-4040-9e05-fd042f541af6/setup-container/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.224239 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_00373ed3-3b08-4040-9e05-fd042f541af6/rabbitmq/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.394872 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_581ccdf3-7a38-4bb7-93ac-035207098fb7/setup-container/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.594222 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_581ccdf3-7a38-4bb7-93ac-035207098fb7/rabbitmq/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.624235 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_581ccdf3-7a38-4bb7-93ac-035207098fb7/setup-container/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.725983 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-44zrf_1f6f8032-e0d0-460f-bc14-653a74481964/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.810885 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-jfhsk_daf8cf99-f61d-4fc6-bc92-045acafed529/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:07 crc kubenswrapper[4786]: I1002 07:39:07.974763 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-c879k_5b6240dd-f9f1-4ca7-8b04-8dc4e002fe11/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:08 crc kubenswrapper[4786]: I1002 07:39:08.081558 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8hhjl_3fe8db1d-ac2c-4028-a843-a74d8e787543/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:08 crc kubenswrapper[4786]: I1002 07:39:08.170919 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ck7z4_d54f4fa9-80d2-47dc-a156-b26cbc9ebd84/ssh-known-hosts-edpm-deployment/0.log" Oct 02 07:39:08 crc kubenswrapper[4786]: I1002 07:39:08.403081 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7dd886698c-blkwg_45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a/proxy-server/0.log" Oct 02 07:39:08 crc kubenswrapper[4786]: I1002 07:39:08.430308 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7dd886698c-blkwg_45f7fdd0-fd4b-47c0-bf23-3cf876c5e63a/proxy-httpd/0.log" Oct 02 07:39:08 crc kubenswrapper[4786]: I1002 07:39:08.590446 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-mjfnf_009aff59-5299-4ae9-a997-99c88f103a92/swift-ring-rebalance/0.log" Oct 02 07:39:08 crc kubenswrapper[4786]: I1002 07:39:08.831489 4786 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/account-auditor/0.log" Oct 02 07:39:08 crc kubenswrapper[4786]: I1002 07:39:08.908207 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/account-reaper/0.log" Oct 02 07:39:08 crc kubenswrapper[4786]: I1002 07:39:08.998306 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/account-replicator/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.026387 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/account-server/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.057484 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/container-auditor/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.177198 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/container-replicator/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.214370 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/container-server/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.244121 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/container-updater/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.333959 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-auditor/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.389528 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-expirer/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.437999 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-replicator/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.532298 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-server/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.574955 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/object-updater/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.669852 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/rsync/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.694877 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_15dfcc2f-b58a-4238-a555-e0e5e4a05e4b/swift-recon-cron/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.904825 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-l2rrm_b470bf8f-a980-45f7-b488-eda5005abf20/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:09 crc kubenswrapper[4786]: I1002 07:39:09.922660 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_2b995083-72d4-4aaf-ad51-2e5ec5b6a4a9/tempest-tests-tempest-tests-runner/0.log" Oct 02 07:39:10 crc kubenswrapper[4786]: I1002 07:39:10.082795 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_a9e8dedb-8789-4b6e-9f4d-0f46ae66d27d/test-operator-logs-container/0.log" Oct 02 07:39:10 crc kubenswrapper[4786]: I1002 
07:39:10.287302 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pr6z8_18524406-101e-474c-853f-5674430d613f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 02 07:39:14 crc kubenswrapper[4786]: I1002 07:39:14.179085 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:39:14 crc kubenswrapper[4786]: E1002 07:39:14.179825 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:39:17 crc kubenswrapper[4786]: I1002 07:39:17.748382 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_26e71aaf-fc49-4218-b001-433de642f9ae/memcached/0.log" Oct 02 07:39:26 crc kubenswrapper[4786]: I1002 07:39:26.179802 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:39:26 crc kubenswrapper[4786]: E1002 07:39:26.180508 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:39:37 crc kubenswrapper[4786]: I1002 07:39:37.777183 4786 generic.go:334] "Generic (PLEG): container finished" podID="29031843-b2ec-493b-b21d-82ca3f857864" 
containerID="edd94b8d382fd65ad96082bb815701d548db4ee0f2816b6102118159cea4d954" exitCode=0 Oct 02 07:39:37 crc kubenswrapper[4786]: I1002 07:39:37.777265 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/crc-debug-tc27p" event={"ID":"29031843-b2ec-493b-b21d-82ca3f857864","Type":"ContainerDied","Data":"edd94b8d382fd65ad96082bb815701d548db4ee0f2816b6102118159cea4d954"} Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.178902 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:39:38 crc kubenswrapper[4786]: E1002 07:39:38.179134 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.850986 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.876314 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mp7t7/crc-debug-tc27p"] Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.882379 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mp7t7/crc-debug-tc27p"] Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.960377 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj5w4\" (UniqueName: \"kubernetes.io/projected/29031843-b2ec-493b-b21d-82ca3f857864-kube-api-access-sj5w4\") pod \"29031843-b2ec-493b-b21d-82ca3f857864\" (UID: \"29031843-b2ec-493b-b21d-82ca3f857864\") " Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.960487 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29031843-b2ec-493b-b21d-82ca3f857864-host\") pod \"29031843-b2ec-493b-b21d-82ca3f857864\" (UID: \"29031843-b2ec-493b-b21d-82ca3f857864\") " Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.960557 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29031843-b2ec-493b-b21d-82ca3f857864-host" (OuterVolumeSpecName: "host") pod "29031843-b2ec-493b-b21d-82ca3f857864" (UID: "29031843-b2ec-493b-b21d-82ca3f857864"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.960949 4786 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29031843-b2ec-493b-b21d-82ca3f857864-host\") on node \"crc\" DevicePath \"\"" Oct 02 07:39:38 crc kubenswrapper[4786]: I1002 07:39:38.965026 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29031843-b2ec-493b-b21d-82ca3f857864-kube-api-access-sj5w4" (OuterVolumeSpecName: "kube-api-access-sj5w4") pod "29031843-b2ec-493b-b21d-82ca3f857864" (UID: "29031843-b2ec-493b-b21d-82ca3f857864"). InnerVolumeSpecName "kube-api-access-sj5w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:39:39 crc kubenswrapper[4786]: I1002 07:39:39.062345 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj5w4\" (UniqueName: \"kubernetes.io/projected/29031843-b2ec-493b-b21d-82ca3f857864-kube-api-access-sj5w4\") on node \"crc\" DevicePath \"\"" Oct 02 07:39:39 crc kubenswrapper[4786]: I1002 07:39:39.790505 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab2430f0e6f39940bdc871868409f6b2e5a895de07fa08e208d6058198ff110" Oct 02 07:39:39 crc kubenswrapper[4786]: I1002 07:39:39.790547 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-tc27p" Oct 02 07:39:39 crc kubenswrapper[4786]: I1002 07:39:39.985101 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mp7t7/crc-debug-bgsbq"] Oct 02 07:39:39 crc kubenswrapper[4786]: E1002 07:39:39.985414 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29031843-b2ec-493b-b21d-82ca3f857864" containerName="container-00" Oct 02 07:39:39 crc kubenswrapper[4786]: I1002 07:39:39.985425 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="29031843-b2ec-493b-b21d-82ca3f857864" containerName="container-00" Oct 02 07:39:39 crc kubenswrapper[4786]: I1002 07:39:39.985597 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="29031843-b2ec-493b-b21d-82ca3f857864" containerName="container-00" Oct 02 07:39:39 crc kubenswrapper[4786]: I1002 07:39:39.986097 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:39 crc kubenswrapper[4786]: I1002 07:39:39.987270 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mp7t7"/"default-dockercfg-479mm" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.075920 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-host\") pod \"crc-debug-bgsbq\" (UID: \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\") " pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.076384 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phht9\" (UniqueName: \"kubernetes.io/projected/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-kube-api-access-phht9\") pod \"crc-debug-bgsbq\" (UID: \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\") " 
pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.177558 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phht9\" (UniqueName: \"kubernetes.io/projected/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-kube-api-access-phht9\") pod \"crc-debug-bgsbq\" (UID: \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\") " pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.177632 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-host\") pod \"crc-debug-bgsbq\" (UID: \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\") " pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.177760 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-host\") pod \"crc-debug-bgsbq\" (UID: \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\") " pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.187151 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29031843-b2ec-493b-b21d-82ca3f857864" path="/var/lib/kubelet/pods/29031843-b2ec-493b-b21d-82ca3f857864/volumes" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.195781 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phht9\" (UniqueName: \"kubernetes.io/projected/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-kube-api-access-phht9\") pod \"crc-debug-bgsbq\" (UID: \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\") " pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.299274 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.797474 4786 generic.go:334] "Generic (PLEG): container finished" podID="fc0fa37b-06fe-48a5-aec3-9dd0a91922fa" containerID="5ed81c1333b7e8fce111af184e2c5dae4178eaa7aa41fb04dea888dca8d862b8" exitCode=0 Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.797558 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" event={"ID":"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa","Type":"ContainerDied","Data":"5ed81c1333b7e8fce111af184e2c5dae4178eaa7aa41fb04dea888dca8d862b8"} Oct 02 07:39:40 crc kubenswrapper[4786]: I1002 07:39:40.797673 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" event={"ID":"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa","Type":"ContainerStarted","Data":"b461448ff4c5772287c98be8ebebeb63085f8b55b6e78d70767007d37941021e"} Oct 02 07:39:41 crc kubenswrapper[4786]: I1002 07:39:41.871783 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:41 crc kubenswrapper[4786]: I1002 07:39:41.899485 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phht9\" (UniqueName: \"kubernetes.io/projected/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-kube-api-access-phht9\") pod \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\" (UID: \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\") " Oct 02 07:39:41 crc kubenswrapper[4786]: I1002 07:39:41.899591 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-host\") pod \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\" (UID: \"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa\") " Oct 02 07:39:41 crc kubenswrapper[4786]: I1002 07:39:41.900187 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-host" (OuterVolumeSpecName: "host") pod "fc0fa37b-06fe-48a5-aec3-9dd0a91922fa" (UID: "fc0fa37b-06fe-48a5-aec3-9dd0a91922fa"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:39:41 crc kubenswrapper[4786]: I1002 07:39:41.903573 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-kube-api-access-phht9" (OuterVolumeSpecName: "kube-api-access-phht9") pod "fc0fa37b-06fe-48a5-aec3-9dd0a91922fa" (UID: "fc0fa37b-06fe-48a5-aec3-9dd0a91922fa"). InnerVolumeSpecName "kube-api-access-phht9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:39:42 crc kubenswrapper[4786]: I1002 07:39:42.001087 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phht9\" (UniqueName: \"kubernetes.io/projected/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-kube-api-access-phht9\") on node \"crc\" DevicePath \"\"" Oct 02 07:39:42 crc kubenswrapper[4786]: I1002 07:39:42.001110 4786 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa-host\") on node \"crc\" DevicePath \"\"" Oct 02 07:39:42 crc kubenswrapper[4786]: I1002 07:39:42.810766 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" event={"ID":"fc0fa37b-06fe-48a5-aec3-9dd0a91922fa","Type":"ContainerDied","Data":"b461448ff4c5772287c98be8ebebeb63085f8b55b6e78d70767007d37941021e"} Oct 02 07:39:42 crc kubenswrapper[4786]: I1002 07:39:42.811020 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b461448ff4c5772287c98be8ebebeb63085f8b55b6e78d70767007d37941021e" Oct 02 07:39:42 crc kubenswrapper[4786]: I1002 07:39:42.810821 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-bgsbq" Oct 02 07:39:45 crc kubenswrapper[4786]: I1002 07:39:45.849182 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mp7t7/crc-debug-bgsbq"] Oct 02 07:39:45 crc kubenswrapper[4786]: I1002 07:39:45.855407 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mp7t7/crc-debug-bgsbq"] Oct 02 07:39:46 crc kubenswrapper[4786]: I1002 07:39:46.187475 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc0fa37b-06fe-48a5-aec3-9dd0a91922fa" path="/var/lib/kubelet/pods/fc0fa37b-06fe-48a5-aec3-9dd0a91922fa/volumes" Oct 02 07:39:46 crc kubenswrapper[4786]: I1002 07:39:46.961495 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mp7t7/crc-debug-6hgx4"] Oct 02 07:39:46 crc kubenswrapper[4786]: E1002 07:39:46.961830 4786 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc0fa37b-06fe-48a5-aec3-9dd0a91922fa" containerName="container-00" Oct 02 07:39:46 crc kubenswrapper[4786]: I1002 07:39:46.961842 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc0fa37b-06fe-48a5-aec3-9dd0a91922fa" containerName="container-00" Oct 02 07:39:46 crc kubenswrapper[4786]: I1002 07:39:46.962003 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc0fa37b-06fe-48a5-aec3-9dd0a91922fa" containerName="container-00" Oct 02 07:39:46 crc kubenswrapper[4786]: I1002 07:39:46.962505 4786 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:46 crc kubenswrapper[4786]: I1002 07:39:46.963858 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-host\") pod \"crc-debug-6hgx4\" (UID: \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\") " pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:46 crc kubenswrapper[4786]: I1002 07:39:46.963915 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqnn\" (UniqueName: \"kubernetes.io/projected/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-kube-api-access-5wqnn\") pod \"crc-debug-6hgx4\" (UID: \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\") " pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:46 crc kubenswrapper[4786]: I1002 07:39:46.963938 4786 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mp7t7"/"default-dockercfg-479mm" Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.066268 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-host\") pod \"crc-debug-6hgx4\" (UID: \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\") " pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.066335 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqnn\" (UniqueName: \"kubernetes.io/projected/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-kube-api-access-5wqnn\") pod \"crc-debug-6hgx4\" (UID: \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\") " pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.066372 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-host\") pod \"crc-debug-6hgx4\" (UID: \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\") " pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.081457 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqnn\" (UniqueName: \"kubernetes.io/projected/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-kube-api-access-5wqnn\") pod \"crc-debug-6hgx4\" (UID: \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\") " pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.284607 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.842447 4786 generic.go:334] "Generic (PLEG): container finished" podID="fec6fcb4-c0bb-4806-814a-184b8dfb7f9a" containerID="52e0a1f9b9104a43ee4efdb07148c7a63a93d7ba26d3b5b296d8fba34253c79f" exitCode=0 Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.842535 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" event={"ID":"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a","Type":"ContainerDied","Data":"52e0a1f9b9104a43ee4efdb07148c7a63a93d7ba26d3b5b296d8fba34253c79f"} Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.842671 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" event={"ID":"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a","Type":"ContainerStarted","Data":"55ad3061d18c33005fd784dee0eada4cae79f7ca20c4ae4537efffe93205e208"} Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.872511 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mp7t7/crc-debug-6hgx4"] Oct 02 07:39:47 crc kubenswrapper[4786]: I1002 07:39:47.878349 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-mp7t7/crc-debug-6hgx4"] Oct 02 07:39:48 crc kubenswrapper[4786]: I1002 07:39:48.916094 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/util/0.log" Oct 02 07:39:48 crc kubenswrapper[4786]: I1002 07:39:48.932560 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.034451 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/pull/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.037363 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/util/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.056497 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/pull/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.096055 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqnn\" (UniqueName: \"kubernetes.io/projected/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-kube-api-access-5wqnn\") pod \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\" (UID: \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\") " Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.096173 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-host\") pod \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\" (UID: \"fec6fcb4-c0bb-4806-814a-184b8dfb7f9a\") " Oct 02 
07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.096378 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-host" (OuterVolumeSpecName: "host") pod "fec6fcb4-c0bb-4806-814a-184b8dfb7f9a" (UID: "fec6fcb4-c0bb-4806-814a-184b8dfb7f9a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.096852 4786 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-host\") on node \"crc\" DevicePath \"\"" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.113120 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-kube-api-access-5wqnn" (OuterVolumeSpecName: "kube-api-access-5wqnn") pod "fec6fcb4-c0bb-4806-814a-184b8dfb7f9a" (UID: "fec6fcb4-c0bb-4806-814a-184b8dfb7f9a"). InnerVolumeSpecName "kube-api-access-5wqnn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.179544 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:39:49 crc kubenswrapper[4786]: E1002 07:39:49.179890 4786 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-p6dmq_openshift-machine-config-operator(79cb22df-4930-4aed-9108-1056074d1000)\"" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.190680 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/util/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.190787 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/pull/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.198900 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqnn\" (UniqueName: \"kubernetes.io/projected/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a-kube-api-access-5wqnn\") on node \"crc\" DevicePath \"\"" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.212931 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0099f90962ae4d3cc6d942ef2ddab275b3922916e1b59044ce042d59e5dfw8n_78d8253b-165c-4e4d-9c1a-2a22c6828e08/extract/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.299627 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-dkqvh_0872cd5d-1fde-4b27-bdb7-6eade27cee9d/kube-rbac-proxy/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.362026 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-f7f98cb69-dkqvh_0872cd5d-1fde-4b27-bdb7-6eade27cee9d/manager/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.382387 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-5b4mb_90cbbf62-df45-490f-a76d-6b24fdfe6aa7/kube-rbac-proxy/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.476581 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859cd486d-5b4mb_90cbbf62-df45-490f-a76d-6b24fdfe6aa7/manager/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.508364 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-4rqms_e229888c-d55e-4b2f-a53a-e588868f98e2/kube-rbac-proxy/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.516058 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77fb7bcf5b-4rqms_e229888c-d55e-4b2f-a53a-e588868f98e2/manager/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.619086 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-w6s6f_40d01a1d-0613-43eb-824b-24b22a879822/kube-rbac-proxy/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.674922 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8bc4775b5-w6s6f_40d01a1d-0613-43eb-824b-24b22a879822/manager/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.739218 
4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-xqp2m_458e49e3-37fa-4fde-b98e-35f6490ad3bc/kube-rbac-proxy/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.790995 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b4fc86755-xqp2m_458e49e3-37fa-4fde-b98e-35f6490ad3bc/manager/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.837477 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-xcd8d_9c27556b-9c6e-4a2d-9c2d-78f471392a85/kube-rbac-proxy/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.869345 4786 scope.go:117] "RemoveContainer" containerID="52e0a1f9b9104a43ee4efdb07148c7a63a93d7ba26d3b5b296d8fba34253c79f" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.869366 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mp7t7/crc-debug-6hgx4" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.900745 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-679b4759bb-xcd8d_9c27556b-9c6e-4a2d-9c2d-78f471392a85/manager/0.log" Oct 02 07:39:49 crc kubenswrapper[4786]: I1002 07:39:49.994986 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5c8fdc4d5c-5vzvp_903d28be-eebf-4dd0-bd15-3f3e2a9416bf/kube-rbac-proxy/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.131286 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5c8fdc4d5c-5vzvp_903d28be-eebf-4dd0-bd15-3f3e2a9416bf/manager/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.145668 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f45cd594f-m99xq_8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9/kube-rbac-proxy/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.174390 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f45cd594f-m99xq_8c50728d-dcc7-4cc1-a0fb-ef8c1355d6f9/manager/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.186189 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fec6fcb4-c0bb-4806-814a-184b8dfb7f9a" path="/var/lib/kubelet/pods/fec6fcb4-c0bb-4806-814a-184b8dfb7f9a/volumes" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.283494 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-jf2qh_bafce6df-ad32-4af9-9e38-d6da16305ee9/kube-rbac-proxy/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.336833 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-59d7dc95cf-jf2qh_bafce6df-ad32-4af9-9e38-d6da16305ee9/manager/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.376395 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-x6hzs_e49d1086-42df-449b-97b8-787edd49ba23/kube-rbac-proxy/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.460936 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-b7cf8cb5f-x6hzs_e49d1086-42df-449b-97b8-787edd49ba23/manager/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.501392 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-7xngz_3d6bf83e-81f7-4c56-994f-1058c8ddbe74/kube-rbac-proxy/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.544196 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf5bb885-7xngz_3d6bf83e-81f7-4c56-994f-1058c8ddbe74/manager/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.636122 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54fbbfcd44-jpgjv_12f9e929-b1eb-4dd9-a686-0154f89b5dfc/kube-rbac-proxy/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.685540 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54fbbfcd44-jpgjv_12f9e929-b1eb-4dd9-a686-0154f89b5dfc/manager/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.767604 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7fd5b6bbc6-8cfbh_71792270-25e9-4028-b306-c235e6378802/kube-rbac-proxy/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.859675 
4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7fd5b6bbc6-8cfbh_71792270-25e9-4028-b306-c235e6378802/manager/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.887479 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-75f8d67d86-xv465_f496c1c2-326e-4429-b192-07a5ca33b28d/kube-rbac-proxy/0.log" Oct 02 07:39:50 crc kubenswrapper[4786]: I1002 07:39:50.933884 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-75f8d67d86-xv465_f496c1c2-326e-4429-b192-07a5ca33b28d/manager/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.041394 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-787874f5b7tcwtm_56dd0e8b-9081-4dbc-8683-a0da0f1be122/kube-rbac-proxy/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.087161 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-787874f5b7tcwtm_56dd0e8b-9081-4dbc-8683-a0da0f1be122/manager/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.185898 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67698bcd47-cjhhk_8830c499-339f-4252-8f35-13671a0b6687/kube-rbac-proxy/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.335335 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-859455d779-ksrkf_f1ba6077-2c56-40f9-bcbc-64bae09186c6/kube-rbac-proxy/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.517102 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-859455d779-ksrkf_f1ba6077-2c56-40f9-bcbc-64bae09186c6/operator/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.564778 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-l4f42_d3794fc6-7388-43e3-bf57-545f152e19c4/registry-server/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.702760 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-p85rq_7aaa7a8a-2599-4f27-9bce-1fcee450fbfa/kube-rbac-proxy/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.751599 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-84c745747f-p85rq_7aaa7a8a-2599-4f27-9bce-1fcee450fbfa/manager/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.844229 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-r2qcj_ed52f44e-1a43-4d07-a613-0d5d7c367a51/kube-rbac-proxy/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.907623 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-598c4c8547-r2qcj_ed52f44e-1a43-4d07-a613-0d5d7c367a51/manager/0.log" Oct 02 07:39:51 crc kubenswrapper[4786]: I1002 07:39:51.979196 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-rj9l5_4910bc83-927a-41bb-a4a6-834c09bcbc6e/operator/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.091572 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-689b4f76c9-gslcx_8e441bad-8657-4e25-a66b-caf4fc28ae8a/kube-rbac-proxy/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.128440 4786 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-67698bcd47-cjhhk_8830c499-339f-4252-8f35-13671a0b6687/manager/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.130894 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-689b4f76c9-gslcx_8e441bad-8657-4e25-a66b-caf4fc28ae8a/manager/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.210188 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-89t2n_ae767b5f-e700-4253-be42-48f90b9fe99e/kube-rbac-proxy/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.287856 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-cb66d6b59-89t2n_ae767b5f-e700-4253-be42-48f90b9fe99e/manager/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.300714 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-cbdf6dc66-phjsn_fcbaee3e-a527-4eb0-9c2b-7ada804e7920/kube-rbac-proxy/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.329080 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-cbdf6dc66-phjsn_fcbaee3e-a527-4eb0-9c2b-7ada804e7920/manager/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.426669 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-68d7bc5569-6n59c_810706c5-80b3-47b8-8058-2b0aa1665942/kube-rbac-proxy/0.log" Oct 02 07:39:52 crc kubenswrapper[4786]: I1002 07:39:52.437654 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-68d7bc5569-6n59c_810706c5-80b3-47b8-8058-2b0aa1665942/manager/0.log" Oct 02 07:40:00 crc kubenswrapper[4786]: 
I1002 07:40:00.198791 4786 scope.go:117] "RemoveContainer" containerID="55d7fd4b4c2033149a220af5f6e79a90c246afe68bc436810f400db154a489c5" Oct 02 07:40:00 crc kubenswrapper[4786]: I1002 07:40:00.940792 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" event={"ID":"79cb22df-4930-4aed-9108-1056074d1000","Type":"ContainerStarted","Data":"04bdd0614e0ce091ac8c457fb0bc3a1088e726c4f75509d1ea73bb1c7afef0c3"} Oct 02 07:40:02 crc kubenswrapper[4786]: I1002 07:40:02.560756 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xw7f8_24e3309e-db58-4fa4-a4f6-08fdb1ddb95c/control-plane-machine-set-operator/0.log" Oct 02 07:40:02 crc kubenswrapper[4786]: I1002 07:40:02.653542 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-97p7p_e0050175-139a-4210-add6-1b7bbe800f27/kube-rbac-proxy/0.log" Oct 02 07:40:02 crc kubenswrapper[4786]: I1002 07:40:02.676958 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-97p7p_e0050175-139a-4210-add6-1b7bbe800f27/machine-api-operator/0.log" Oct 02 07:40:10 crc kubenswrapper[4786]: I1002 07:40:10.567778 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-c7bxl_27ce8ca7-3393-43c6-ac0e-6e4128f84527/cert-manager-controller/0.log" Oct 02 07:40:10 crc kubenswrapper[4786]: I1002 07:40:10.667502 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-8tlx5_5286d0c8-0ce3-4d66-882f-9ffea6c90fa4/cert-manager-cainjector/0.log" Oct 02 07:40:10 crc kubenswrapper[4786]: I1002 07:40:10.732227 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9wd5r_ca1f4166-adf4-4281-925f-224930e8f775/cert-manager-webhook/0.log" Oct 02 
07:40:18 crc kubenswrapper[4786]: I1002 07:40:18.727280 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-z8sr6_fe88424e-e46c-46b3-a2e0-7bba5ef147b3/nmstate-console-plugin/0.log" Oct 02 07:40:18 crc kubenswrapper[4786]: I1002 07:40:18.861351 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-kgjxr_84995ef7-937a-442e-a016-f22a24d82882/kube-rbac-proxy/0.log" Oct 02 07:40:18 crc kubenswrapper[4786]: I1002 07:40:18.870105 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kfbdc_9b208785-3928-4bc7-a6fd-1bcee5029917/nmstate-handler/0.log" Oct 02 07:40:18 crc kubenswrapper[4786]: I1002 07:40:18.920321 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-kgjxr_84995ef7-937a-442e-a016-f22a24d82882/nmstate-metrics/0.log" Oct 02 07:40:19 crc kubenswrapper[4786]: I1002 07:40:19.009609 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-pn52f_75a4981d-3614-4013-80e8-dcc8cd60da94/nmstate-operator/0.log" Oct 02 07:40:19 crc kubenswrapper[4786]: I1002 07:40:19.055341 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-6szkj_9376f046-2ecd-4c20-bf82-4f18490d91d9/nmstate-webhook/0.log" Oct 02 07:40:27 crc kubenswrapper[4786]: I1002 07:40:27.654971 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-4pdp4_defc1b9a-20a3-4272-af46-ad01ef957dba/kube-rbac-proxy/0.log" Oct 02 07:40:27 crc kubenswrapper[4786]: I1002 07:40:27.762452 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-4pdp4_defc1b9a-20a3-4272-af46-ad01ef957dba/controller/0.log" Oct 02 07:40:27 crc kubenswrapper[4786]: I1002 07:40:27.827816 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-frr-files/0.log" Oct 02 07:40:27 crc kubenswrapper[4786]: I1002 07:40:27.943018 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-reloader/0.log" Oct 02 07:40:27 crc kubenswrapper[4786]: I1002 07:40:27.966223 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-metrics/0.log" Oct 02 07:40:27 crc kubenswrapper[4786]: I1002 07:40:27.969975 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-reloader/0.log" Oct 02 07:40:27 crc kubenswrapper[4786]: I1002 07:40:27.970596 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-frr-files/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.091967 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-reloader/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.093240 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-frr-files/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.108938 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-metrics/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.121211 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-metrics/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.230075 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-metrics/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.234259 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-reloader/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.261444 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/cp-frr-files/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.283337 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/controller/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.376573 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/frr-metrics/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.408054 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/kube-rbac-proxy/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.443976 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/kube-rbac-proxy-frr/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.591504 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/reloader/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.606158 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-qspcx_bfefe2c3-8e13-45a3-b700-cda75a37345c/frr-k8s-webhook-server/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.809026 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85846c6c54-9wx2x_42b1054f-e5f7-4d21-a4a8-98bcb85946c5/manager/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.894028 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6765d85898-4g7pf_b29b896b-61fb-40e7-80ce-d87fc031e3ae/webhook-server/0.log" Oct 02 07:40:28 crc kubenswrapper[4786]: I1002 07:40:28.993172 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6drsl_3b91fb80-e839-4d96-aa9a-4e08642aafe1/kube-rbac-proxy/0.log" Oct 02 07:40:29 crc kubenswrapper[4786]: I1002 07:40:29.380784 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6drsl_3b91fb80-e839-4d96-aa9a-4e08642aafe1/speaker/0.log" Oct 02 07:40:29 crc kubenswrapper[4786]: I1002 07:40:29.384331 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-lc2gd_0678c649-4b0b-4079-865a-7e85f6005a3d/frr/0.log" Oct 02 07:40:36 crc kubenswrapper[4786]: I1002 07:40:36.894790 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/util/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.025164 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/util/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.026907 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/pull/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.032259 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/pull/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.154337 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/util/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.161345 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/pull/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.179956 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2drrgf_1bee82fb-73b9-40ee-800d-9b85d84324d6/extract/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.256672 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-utilities/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.378046 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-utilities/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.385927 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-content/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.416779 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-content/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.543355 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-content/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.551480 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/extract-utilities/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.740157 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-utilities/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.746275 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lqpjl_32e6a72b-b5cb-4538-8e54-0a48ad8b88a0/registry-server/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.833166 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-utilities/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.856311 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-content/0.log" Oct 02 07:40:37 crc kubenswrapper[4786]: I1002 07:40:37.860342 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-content/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.006171 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-utilities/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.026170 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/extract-content/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.087185 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lj274_2240fee8-3d4b-45e9-b7a1-242d5102f56e/registry-server/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.230847 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/util/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.344660 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/pull/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.352398 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/util/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.359187 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/pull/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.465796 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/util/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.469207 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/extract/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 
07:40:38.478177 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cjl85c_5490b488-b520-4906-92b6-b13a997075fb/pull/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.599955 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-tq2t2_53de3bf0-46ae-4969-a69e-2ad45e207407/marketplace-operator/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.622063 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-utilities/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.735273 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-content/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.737405 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-utilities/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.737989 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-content/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.894852 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-utilities/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.914902 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/extract-content/0.log" Oct 02 07:40:38 crc kubenswrapper[4786]: I1002 07:40:38.978540 4786 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-rtbzr_cdc7bd8b-8dab-48d7-a338-a6eb79d14c13/registry-server/0.log" Oct 02 07:40:39 crc kubenswrapper[4786]: I1002 07:40:39.050440 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-utilities/0.log" Oct 02 07:40:39 crc kubenswrapper[4786]: I1002 07:40:39.169406 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-utilities/0.log" Oct 02 07:40:39 crc kubenswrapper[4786]: I1002 07:40:39.172935 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-content/0.log" Oct 02 07:40:39 crc kubenswrapper[4786]: I1002 07:40:39.176393 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-content/0.log" Oct 02 07:40:39 crc kubenswrapper[4786]: I1002 07:40:39.302897 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-utilities/0.log" Oct 02 07:40:39 crc kubenswrapper[4786]: I1002 07:40:39.306684 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/extract-content/0.log" Oct 02 07:40:39 crc kubenswrapper[4786]: I1002 07:40:39.655068 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qmvfz_88d3da98-14ac-4121-af09-590caee1d21e/registry-server/0.log" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.314962 4786 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fqcjv"] Oct 02 07:41:34 crc kubenswrapper[4786]: E1002 07:41:34.315659 4786 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec6fcb4-c0bb-4806-814a-184b8dfb7f9a" containerName="container-00" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.315672 4786 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec6fcb4-c0bb-4806-814a-184b8dfb7f9a" containerName="container-00" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.315892 4786 memory_manager.go:354] "RemoveStaleState removing state" podUID="fec6fcb4-c0bb-4806-814a-184b8dfb7f9a" containerName="container-00" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.317062 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.321078 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqcjv"] Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.342169 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-utilities\") pod \"redhat-marketplace-fqcjv\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.342332 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm9ns\" (UniqueName: \"kubernetes.io/projected/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-kube-api-access-hm9ns\") pod \"redhat-marketplace-fqcjv\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.342461 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-catalog-content\") pod 
\"redhat-marketplace-fqcjv\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.443705 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm9ns\" (UniqueName: \"kubernetes.io/projected/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-kube-api-access-hm9ns\") pod \"redhat-marketplace-fqcjv\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.443951 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-catalog-content\") pod \"redhat-marketplace-fqcjv\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.443994 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-utilities\") pod \"redhat-marketplace-fqcjv\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.444344 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-utilities\") pod \"redhat-marketplace-fqcjv\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.444793 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-catalog-content\") pod \"redhat-marketplace-fqcjv\" (UID: 
\"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.458765 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm9ns\" (UniqueName: \"kubernetes.io/projected/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-kube-api-access-hm9ns\") pod \"redhat-marketplace-fqcjv\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:34 crc kubenswrapper[4786]: I1002 07:41:34.639300 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.001897 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqcjv"] Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.542550 4786 generic.go:334] "Generic (PLEG): container finished" podID="a1fd79bf-d95c-4dec-acc2-010ae2ef078a" containerID="0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9" exitCode=0 Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.542598 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqcjv" event={"ID":"a1fd79bf-d95c-4dec-acc2-010ae2ef078a","Type":"ContainerDied","Data":"0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9"} Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.542827 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqcjv" event={"ID":"a1fd79bf-d95c-4dec-acc2-010ae2ef078a","Type":"ContainerStarted","Data":"f000eb5a542214d96d309d107a8b306751e39a64a7a69a2d1b15596fbce670a1"} Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.544510 4786 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.712849 4786 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j9tt7"] Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.714582 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.723121 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9tt7"] Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.764373 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-catalog-content\") pod \"redhat-operators-j9tt7\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.764544 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpkbt\" (UniqueName: \"kubernetes.io/projected/71c96198-9b42-44d5-aac8-ff62f531ffa8-kube-api-access-mpkbt\") pod \"redhat-operators-j9tt7\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.764629 4786 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-utilities\") pod \"redhat-operators-j9tt7\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.865820 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-catalog-content\") pod \"redhat-operators-j9tt7\" (UID: 
\"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.865951 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpkbt\" (UniqueName: \"kubernetes.io/projected/71c96198-9b42-44d5-aac8-ff62f531ffa8-kube-api-access-mpkbt\") pod \"redhat-operators-j9tt7\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.866037 4786 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-utilities\") pod \"redhat-operators-j9tt7\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.866773 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-utilities\") pod \"redhat-operators-j9tt7\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.867573 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-catalog-content\") pod \"redhat-operators-j9tt7\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:35 crc kubenswrapper[4786]: I1002 07:41:35.883150 4786 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpkbt\" (UniqueName: \"kubernetes.io/projected/71c96198-9b42-44d5-aac8-ff62f531ffa8-kube-api-access-mpkbt\") pod \"redhat-operators-j9tt7\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") " 
pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:36 crc kubenswrapper[4786]: I1002 07:41:36.037303 4786 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:36 crc kubenswrapper[4786]: I1002 07:41:36.423180 4786 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9tt7"] Oct 02 07:41:36 crc kubenswrapper[4786]: W1002 07:41:36.433650 4786 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c96198_9b42_44d5_aac8_ff62f531ffa8.slice/crio-5ba0bf6245093dd3db42bdd3daea0743f3dfb02dcdaf4e17b97f3ada318d5040 WatchSource:0}: Error finding container 5ba0bf6245093dd3db42bdd3daea0743f3dfb02dcdaf4e17b97f3ada318d5040: Status 404 returned error can't find the container with id 5ba0bf6245093dd3db42bdd3daea0743f3dfb02dcdaf4e17b97f3ada318d5040 Oct 02 07:41:36 crc kubenswrapper[4786]: I1002 07:41:36.559908 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9tt7" event={"ID":"71c96198-9b42-44d5-aac8-ff62f531ffa8","Type":"ContainerStarted","Data":"5ba0bf6245093dd3db42bdd3daea0743f3dfb02dcdaf4e17b97f3ada318d5040"} Oct 02 07:41:36 crc kubenswrapper[4786]: I1002 07:41:36.567201 4786 generic.go:334] "Generic (PLEG): container finished" podID="a1fd79bf-d95c-4dec-acc2-010ae2ef078a" containerID="49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19" exitCode=0 Oct 02 07:41:36 crc kubenswrapper[4786]: I1002 07:41:36.567237 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqcjv" event={"ID":"a1fd79bf-d95c-4dec-acc2-010ae2ef078a","Type":"ContainerDied","Data":"49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19"} Oct 02 07:41:37 crc kubenswrapper[4786]: I1002 07:41:37.574562 4786 generic.go:334] "Generic (PLEG): container finished" 
podID="71c96198-9b42-44d5-aac8-ff62f531ffa8" containerID="f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a" exitCode=0 Oct 02 07:41:37 crc kubenswrapper[4786]: I1002 07:41:37.574747 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9tt7" event={"ID":"71c96198-9b42-44d5-aac8-ff62f531ffa8","Type":"ContainerDied","Data":"f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a"} Oct 02 07:41:37 crc kubenswrapper[4786]: I1002 07:41:37.576964 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqcjv" event={"ID":"a1fd79bf-d95c-4dec-acc2-010ae2ef078a","Type":"ContainerStarted","Data":"640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3"} Oct 02 07:41:37 crc kubenswrapper[4786]: I1002 07:41:37.600124 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fqcjv" podStartSLOduration=2.045419239 podStartE2EDuration="3.600113068s" podCreationTimestamp="2025-10-02 07:41:34 +0000 UTC" firstStartedPulling="2025-10-02 07:41:35.544318079 +0000 UTC m=+3305.665501210" lastFinishedPulling="2025-10-02 07:41:37.099011917 +0000 UTC m=+3307.220195039" observedRunningTime="2025-10-02 07:41:37.598352558 +0000 UTC m=+3307.719535699" watchObservedRunningTime="2025-10-02 07:41:37.600113068 +0000 UTC m=+3307.721296198" Oct 02 07:41:38 crc kubenswrapper[4786]: I1002 07:41:38.584871 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9tt7" event={"ID":"71c96198-9b42-44d5-aac8-ff62f531ffa8","Type":"ContainerStarted","Data":"b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603"} Oct 02 07:41:39 crc kubenswrapper[4786]: I1002 07:41:39.593639 4786 generic.go:334] "Generic (PLEG): container finished" podID="71c96198-9b42-44d5-aac8-ff62f531ffa8" containerID="b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603" exitCode=0 Oct 02 
07:41:39 crc kubenswrapper[4786]: I1002 07:41:39.593676 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9tt7" event={"ID":"71c96198-9b42-44d5-aac8-ff62f531ffa8","Type":"ContainerDied","Data":"b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603"} Oct 02 07:41:39 crc kubenswrapper[4786]: I1002 07:41:39.826100 4786 scope.go:117] "RemoveContainer" containerID="4d1074959cdbca1af8404b617dc346ba1e9b59eb4653bb50fc696c3c3e1355ed" Oct 02 07:41:40 crc kubenswrapper[4786]: I1002 07:41:40.601046 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9tt7" event={"ID":"71c96198-9b42-44d5-aac8-ff62f531ffa8","Type":"ContainerStarted","Data":"aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d"} Oct 02 07:41:40 crc kubenswrapper[4786]: I1002 07:41:40.619802 4786 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j9tt7" podStartSLOduration=3.045237409 podStartE2EDuration="5.619788576s" podCreationTimestamp="2025-10-02 07:41:35 +0000 UTC" firstStartedPulling="2025-10-02 07:41:37.576081294 +0000 UTC m=+3307.697264425" lastFinishedPulling="2025-10-02 07:41:40.150632461 +0000 UTC m=+3310.271815592" observedRunningTime="2025-10-02 07:41:40.613726365 +0000 UTC m=+3310.734909506" watchObservedRunningTime="2025-10-02 07:41:40.619788576 +0000 UTC m=+3310.740971707" Oct 02 07:41:44 crc kubenswrapper[4786]: I1002 07:41:44.640206 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:44 crc kubenswrapper[4786]: I1002 07:41:44.640968 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:44 crc kubenswrapper[4786]: I1002 07:41:44.670429 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:45 crc kubenswrapper[4786]: I1002 07:41:45.664094 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:45 crc kubenswrapper[4786]: I1002 07:41:45.698006 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqcjv"] Oct 02 07:41:46 crc kubenswrapper[4786]: I1002 07:41:46.037784 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:46 crc kubenswrapper[4786]: I1002 07:41:46.038363 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:46 crc kubenswrapper[4786]: I1002 07:41:46.069803 4786 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:46 crc kubenswrapper[4786]: I1002 07:41:46.682751 4786 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j9tt7" Oct 02 07:41:47 crc kubenswrapper[4786]: I1002 07:41:47.295550 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j9tt7"] Oct 02 07:41:47 crc kubenswrapper[4786]: I1002 07:41:47.654525 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fqcjv" podUID="a1fd79bf-d95c-4dec-acc2-010ae2ef078a" containerName="registry-server" containerID="cri-o://640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3" gracePeriod=2 Oct 02 07:41:47 crc kubenswrapper[4786]: I1002 07:41:47.996347 4786 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqcjv" Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.043929 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-utilities\") pod \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.044004 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm9ns\" (UniqueName: \"kubernetes.io/projected/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-kube-api-access-hm9ns\") pod \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.044028 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-catalog-content\") pod \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\" (UID: \"a1fd79bf-d95c-4dec-acc2-010ae2ef078a\") " Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.057937 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-utilities" (OuterVolumeSpecName: "utilities") pod "a1fd79bf-d95c-4dec-acc2-010ae2ef078a" (UID: "a1fd79bf-d95c-4dec-acc2-010ae2ef078a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.058079 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1fd79bf-d95c-4dec-acc2-010ae2ef078a" (UID: "a1fd79bf-d95c-4dec-acc2-010ae2ef078a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.061825 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-kube-api-access-hm9ns" (OuterVolumeSpecName: "kube-api-access-hm9ns") pod "a1fd79bf-d95c-4dec-acc2-010ae2ef078a" (UID: "a1fd79bf-d95c-4dec-acc2-010ae2ef078a"). InnerVolumeSpecName "kube-api-access-hm9ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.145339 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.145362 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm9ns\" (UniqueName: \"kubernetes.io/projected/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-kube-api-access-hm9ns\") on node \"crc\" DevicePath \"\"" Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.145371 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1fd79bf-d95c-4dec-acc2-010ae2ef078a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.662168 4786 generic.go:334] "Generic (PLEG): container finished" podID="a1fd79bf-d95c-4dec-acc2-010ae2ef078a" containerID="640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3" exitCode=0 Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.662359 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqcjv" event={"ID":"a1fd79bf-d95c-4dec-acc2-010ae2ef078a","Type":"ContainerDied","Data":"640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3"} Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.662398 4786 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqcjv" event={"ID":"a1fd79bf-d95c-4dec-acc2-010ae2ef078a","Type":"ContainerDied","Data":"f000eb5a542214d96d309d107a8b306751e39a64a7a69a2d1b15596fbce670a1"}
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.662433 4786 scope.go:117] "RemoveContainer" containerID="640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.662479 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j9tt7" podUID="71c96198-9b42-44d5-aac8-ff62f531ffa8" containerName="registry-server" containerID="cri-o://aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d" gracePeriod=2
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.662564 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqcjv"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.687790 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqcjv"]
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.690013 4786 scope.go:117] "RemoveContainer" containerID="49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.696097 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqcjv"]
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.705947 4786 scope.go:117] "RemoveContainer" containerID="0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.827093 4786 scope.go:117] "RemoveContainer" containerID="640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3"
Oct 02 07:41:48 crc kubenswrapper[4786]: E1002 07:41:48.827907 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3\": container with ID starting with 640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3 not found: ID does not exist" containerID="640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.827937 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3"} err="failed to get container status \"640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3\": rpc error: code = NotFound desc = could not find container \"640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3\": container with ID starting with 640b36dea5625df9ba6aa6faae9b136208081ec29c57e544fd7dbd6854a25de3 not found: ID does not exist"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.827956 4786 scope.go:117] "RemoveContainer" containerID="49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19"
Oct 02 07:41:48 crc kubenswrapper[4786]: E1002 07:41:48.828551 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19\": container with ID starting with 49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19 not found: ID does not exist" containerID="49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.828590 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19"} err="failed to get container status \"49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19\": rpc error: code = NotFound desc = could not find container \"49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19\": container with ID starting with 49be32d50e4a0210e73b1413a9a501e0df0c5b0b8ca300e4c34f6a78566c3c19 not found: ID does not exist"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.828615 4786 scope.go:117] "RemoveContainer" containerID="0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9"
Oct 02 07:41:48 crc kubenswrapper[4786]: E1002 07:41:48.829127 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9\": container with ID starting with 0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9 not found: ID does not exist" containerID="0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9"
Oct 02 07:41:48 crc kubenswrapper[4786]: I1002 07:41:48.829169 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9"} err="failed to get container status \"0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9\": rpc error: code = NotFound desc = could not find container \"0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9\": container with ID starting with 0fc5a2338afdac9b37e235d3ae782da583149e1d5d7d1fff0b0d1db52597e0f9 not found: ID does not exist"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.004329 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j9tt7"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.065020 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-catalog-content\") pod \"71c96198-9b42-44d5-aac8-ff62f531ffa8\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") "
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.065121 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-utilities\") pod \"71c96198-9b42-44d5-aac8-ff62f531ffa8\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") "
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.065141 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpkbt\" (UniqueName: \"kubernetes.io/projected/71c96198-9b42-44d5-aac8-ff62f531ffa8-kube-api-access-mpkbt\") pod \"71c96198-9b42-44d5-aac8-ff62f531ffa8\" (UID: \"71c96198-9b42-44d5-aac8-ff62f531ffa8\") "
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.066752 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-utilities" (OuterVolumeSpecName: "utilities") pod "71c96198-9b42-44d5-aac8-ff62f531ffa8" (UID: "71c96198-9b42-44d5-aac8-ff62f531ffa8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.078595 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c96198-9b42-44d5-aac8-ff62f531ffa8-kube-api-access-mpkbt" (OuterVolumeSpecName: "kube-api-access-mpkbt") pod "71c96198-9b42-44d5-aac8-ff62f531ffa8" (UID: "71c96198-9b42-44d5-aac8-ff62f531ffa8"). InnerVolumeSpecName "kube-api-access-mpkbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.166751 4786 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.166775 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpkbt\" (UniqueName: \"kubernetes.io/projected/71c96198-9b42-44d5-aac8-ff62f531ffa8-kube-api-access-mpkbt\") on node \"crc\" DevicePath \"\""
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.671473 4786 generic.go:334] "Generic (PLEG): container finished" podID="71c96198-9b42-44d5-aac8-ff62f531ffa8" containerID="aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d" exitCode=0
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.671619 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9tt7" event={"ID":"71c96198-9b42-44d5-aac8-ff62f531ffa8","Type":"ContainerDied","Data":"aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d"}
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.671664 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9tt7" event={"ID":"71c96198-9b42-44d5-aac8-ff62f531ffa8","Type":"ContainerDied","Data":"5ba0bf6245093dd3db42bdd3daea0743f3dfb02dcdaf4e17b97f3ada318d5040"}
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.671675 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j9tt7"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.671684 4786 scope.go:117] "RemoveContainer" containerID="aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.684878 4786 scope.go:117] "RemoveContainer" containerID="b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.702433 4786 scope.go:117] "RemoveContainer" containerID="f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.717744 4786 scope.go:117] "RemoveContainer" containerID="aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d"
Oct 02 07:41:49 crc kubenswrapper[4786]: E1002 07:41:49.717986 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d\": container with ID starting with aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d not found: ID does not exist" containerID="aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.718013 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d"} err="failed to get container status \"aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d\": rpc error: code = NotFound desc = could not find container \"aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d\": container with ID starting with aa403355bcda7f16309a9c0c32f47f6f6a597753c62a68748930dd8f9db83b5d not found: ID does not exist"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.718030 4786 scope.go:117] "RemoveContainer" containerID="b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603"
Oct 02 07:41:49 crc kubenswrapper[4786]: E1002 07:41:49.718280 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603\": container with ID starting with b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603 not found: ID does not exist" containerID="b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.718300 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603"} err="failed to get container status \"b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603\": rpc error: code = NotFound desc = could not find container \"b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603\": container with ID starting with b34364ca4e6b3523c62b34c79ab9514503c42e2b75e944085ed93246bbb2c603 not found: ID does not exist"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.718313 4786 scope.go:117] "RemoveContainer" containerID="f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a"
Oct 02 07:41:49 crc kubenswrapper[4786]: E1002 07:41:49.718487 4786 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a\": container with ID starting with f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a not found: ID does not exist" containerID="f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.718504 4786 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a"} err="failed to get container status \"f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a\": rpc error: code = NotFound desc = could not find container \"f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a\": container with ID starting with f57f539d3be9e3ca38f5d784f97fac51de9631d5b2111e5c8267d207315e621a not found: ID does not exist"
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.763112 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71c96198-9b42-44d5-aac8-ff62f531ffa8" (UID: "71c96198-9b42-44d5-aac8-ff62f531ffa8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.772402 4786 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c96198-9b42-44d5-aac8-ff62f531ffa8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 07:41:49 crc kubenswrapper[4786]: I1002 07:41:49.996349 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j9tt7"]
Oct 02 07:41:50 crc kubenswrapper[4786]: I1002 07:41:50.002706 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j9tt7"]
Oct 02 07:41:50 crc kubenswrapper[4786]: I1002 07:41:50.186228 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c96198-9b42-44d5-aac8-ff62f531ffa8" path="/var/lib/kubelet/pods/71c96198-9b42-44d5-aac8-ff62f531ffa8/volumes"
Oct 02 07:41:50 crc kubenswrapper[4786]: I1002 07:41:50.186807 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1fd79bf-d95c-4dec-acc2-010ae2ef078a" path="/var/lib/kubelet/pods/a1fd79bf-d95c-4dec-acc2-010ae2ef078a/volumes"
Oct 02 07:42:02 crc kubenswrapper[4786]: I1002 07:42:02.761579 4786 generic.go:334] "Generic (PLEG): container finished" podID="c7dd6349-7b47-45a9-bfca-e8d3080d3353" containerID="67385f0d4d5e92d84c80b095266c0e6893cf4009f30ad55355729b5dfd6f73c5" exitCode=0
Oct 02 07:42:02 crc kubenswrapper[4786]: I1002 07:42:02.761646 4786 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mp7t7/must-gather-td5lb" event={"ID":"c7dd6349-7b47-45a9-bfca-e8d3080d3353","Type":"ContainerDied","Data":"67385f0d4d5e92d84c80b095266c0e6893cf4009f30ad55355729b5dfd6f73c5"}
Oct 02 07:42:02 crc kubenswrapper[4786]: I1002 07:42:02.762310 4786 scope.go:117] "RemoveContainer" containerID="67385f0d4d5e92d84c80b095266c0e6893cf4009f30ad55355729b5dfd6f73c5"
Oct 02 07:42:03 crc kubenswrapper[4786]: I1002 07:42:03.292577 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mp7t7_must-gather-td5lb_c7dd6349-7b47-45a9-bfca-e8d3080d3353/gather/0.log"
Oct 02 07:42:12 crc kubenswrapper[4786]: I1002 07:42:12.526440 4786 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mp7t7/must-gather-td5lb"]
Oct 02 07:42:12 crc kubenswrapper[4786]: I1002 07:42:12.527004 4786 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mp7t7/must-gather-td5lb" podUID="c7dd6349-7b47-45a9-bfca-e8d3080d3353" containerName="copy" containerID="cri-o://c1174d80cfdbae804e93da5c2a4f688ea8778ee89cd2c5b5764be478b6e1bd0b" gracePeriod=2
Oct 02 07:42:12 crc kubenswrapper[4786]: I1002 07:42:12.532505 4786 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mp7t7/must-gather-td5lb"]
Oct 02 07:42:12 crc kubenswrapper[4786]: I1002 07:42:12.826358 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mp7t7_must-gather-td5lb_c7dd6349-7b47-45a9-bfca-e8d3080d3353/copy/0.log"
Oct 02 07:42:12 crc kubenswrapper[4786]: I1002 07:42:12.826802 4786 generic.go:334] "Generic (PLEG): container finished" podID="c7dd6349-7b47-45a9-bfca-e8d3080d3353" containerID="c1174d80cfdbae804e93da5c2a4f688ea8778ee89cd2c5b5764be478b6e1bd0b" exitCode=143
Oct 02 07:42:12 crc kubenswrapper[4786]: I1002 07:42:12.826841 4786 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd35286938b1a116df4d36ce7e27ef4779bcd82d0551d1e6261e358a5a39890a"
Oct 02 07:42:12 crc kubenswrapper[4786]: I1002 07:42:12.862542 4786 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mp7t7_must-gather-td5lb_c7dd6349-7b47-45a9-bfca-e8d3080d3353/copy/0.log"
Oct 02 07:42:12 crc kubenswrapper[4786]: I1002 07:42:12.862863 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mp7t7/must-gather-td5lb"
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.000438 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7dd6349-7b47-45a9-bfca-e8d3080d3353-must-gather-output\") pod \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") "
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.000487 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rktn7\" (UniqueName: \"kubernetes.io/projected/c7dd6349-7b47-45a9-bfca-e8d3080d3353-kube-api-access-rktn7\") pod \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") "
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.004668 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7dd6349-7b47-45a9-bfca-e8d3080d3353-kube-api-access-rktn7" (OuterVolumeSpecName: "kube-api-access-rktn7") pod "c7dd6349-7b47-45a9-bfca-e8d3080d3353" (UID: "c7dd6349-7b47-45a9-bfca-e8d3080d3353"). InnerVolumeSpecName "kube-api-access-rktn7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.103264 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dd6349-7b47-45a9-bfca-e8d3080d3353-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c7dd6349-7b47-45a9-bfca-e8d3080d3353" (UID: "c7dd6349-7b47-45a9-bfca-e8d3080d3353"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.103859 4786 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7dd6349-7b47-45a9-bfca-e8d3080d3353-must-gather-output\") pod \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\" (UID: \"c7dd6349-7b47-45a9-bfca-e8d3080d3353\") "
Oct 02 07:42:13 crc kubenswrapper[4786]: W1002 07:42:13.103989 4786 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c7dd6349-7b47-45a9-bfca-e8d3080d3353/volumes/kubernetes.io~empty-dir/must-gather-output
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.104015 4786 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7dd6349-7b47-45a9-bfca-e8d3080d3353-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c7dd6349-7b47-45a9-bfca-e8d3080d3353" (UID: "c7dd6349-7b47-45a9-bfca-e8d3080d3353"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.104491 4786 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c7dd6349-7b47-45a9-bfca-e8d3080d3353-must-gather-output\") on node \"crc\" DevicePath \"\""
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.104557 4786 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rktn7\" (UniqueName: \"kubernetes.io/projected/c7dd6349-7b47-45a9-bfca-e8d3080d3353-kube-api-access-rktn7\") on node \"crc\" DevicePath \"\""
Oct 02 07:42:13 crc kubenswrapper[4786]: I1002 07:42:13.833212 4786 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mp7t7/must-gather-td5lb"
Oct 02 07:42:14 crc kubenswrapper[4786]: I1002 07:42:14.186909 4786 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7dd6349-7b47-45a9-bfca-e8d3080d3353" path="/var/lib/kubelet/pods/c7dd6349-7b47-45a9-bfca-e8d3080d3353/volumes"
Oct 02 07:42:27 crc kubenswrapper[4786]: I1002 07:42:27.497454 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 07:42:27 crc kubenswrapper[4786]: I1002 07:42:27.497845 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 07:42:57 crc kubenswrapper[4786]: I1002 07:42:57.497061 4786 patch_prober.go:28] interesting pod/machine-config-daemon-p6dmq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 07:42:57 crc kubenswrapper[4786]: I1002 07:42:57.497432 4786 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-p6dmq" podUID="79cb22df-4930-4aed-9108-1056074d1000" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"